Requirements (Mandatory)
• 10+ years in the data industry across solution design, architecture, and engineering.
• 3+ years of Databricks solution implementation.
• Proven consulting/services experience and ability to present solutions to senior client stakeholders.
• Hands‑on experience leading large data platform implementations.
• Deep expertise in Databricks, including lakehouse architectures, MLflow, and Delta Lake.
• Strong knowledge of cloud concepts and architecture (AWS).
• Proficiency in data modelling, Spark, SQL, ETL/ELT, and orchestration tools.
• Understanding of data governance with Unity Catalog, security, and modern data platform design.
Requirements (Nice to have)
• Knowledge of AI/ML requirements and the ability to work with AI/ML experts.
• Pre‑sales experience, including use case definition, POC creation, and RFP responses.
• Knowledge of MLOps, data observability, DevOps/CI-CD, and BI tools.
• Databricks or AWS certifications.
• Excellent communication, client‑facing abilities, and solution articulation.
• Hands‑on, consultative, and able to lead teams and delivery.
Job Responsibilities
• Lead end‑to‑end data platform implementations on Databricks, ensuring scalable and high‑quality delivery.
• Align data architecture with AI/ML initiatives and guide clients on data readiness for AI.
• Mentor and develop data engineering and architecture teams.
• Contribute best practices, reference architectures, and reusable assets to the internal Solution Review Board.