Job Description
We are looking for a skilled Data Engineer to join our team in Frankfurt, Germany. As a key member of our data engineering team, you will be responsible for designing, building, and maintaining scalable data pipelines using Azure Data Services, Databricks, and/or Microsoft Fabric.
Key Responsibilities
• Design and build scalable data pipelines using Azure Data Services, Databricks, and/or Microsoft Fabric.
• Consolidate structured/unstructured data into governed lakes and warehouses for BI and AI use cases.
• Implement robust data models and storage architectures (Star, Snowflake, Medallion).
• Ensure data integrity, quality, lineage, security, and governance across the full data lifecycle.
• Automate workflows using Azure DevOps, GitHub Actions, or other CI/CD tools.
• Collaborate in client workshops, translating requirements into technical Azure-native solutions.
• Optimize performance and cost efficiency of the data infrastructure.
Qualifications
To be successful in this role, you will need:
• 3+ years of real-world project experience as a Data Engineer in Azure ecosystems.
• Fluency in German and English.
• Advanced SQL skills, including query performance tuning.
• Strong background in dimensional data modeling and familiarity with ETL patterns such as the Medallion architecture.
• Hands-on experience with Azure Data Factory, Azure Synapse, and Databricks; experience with Microsoft Fabric is a plus.
Preferred Qualifications
The following qualifications are not mandatory but highly desirable:
• Proficiency in Microsoft Power BI.
• Exposure to Data Mesh or domain-oriented data architectures.
• Experience with Delta Lake, Unity Catalog, or Feature Store.