Dice is the leading career destination for tech experts at every stage of their careers. Our client, Tekcel8, is seeking the following. Apply via Dice today!
Job Summary
We are seeking an experienced Senior Data Engineer with 8+ years of hands-on experience designing and delivering large-scale data platforms. The ideal candidate will lead the design of robust data pipelines, drive data architecture decisions, and mentor junior engineers while working across AWS and Azure cloud environments.
Key Responsibilities
• Architect, design, and implement scalable, fault-tolerant data pipelines (batch and streaming)
• Lead development of ETL/ELT frameworks using Python, SQL, Spark/PySpark
• Design and optimize data lakes, lakehouses, and data warehouses
• Build and manage cloud-native data solutions on AWS and Azure
• Optimize performance, cost, and scalability of data platforms
• Define and enforce data engineering standards, best practices, and governance
• Ensure adherence to data quality, lineage, security, and compliance requirements
• Collaborate with data architects, analysts, data scientists, and business stakeholders
• Mentor junior data engineers and conduct code reviews
• Troubleshoot complex data issues and drive root-cause analysis
• Support CI/CD pipelines and infrastructure-as-code for data systems
Required Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
• 8+ years of experience in data engineering or related roles
• Expert-level proficiency in Python and SQL
• Strong hands-on experience with Apache Spark / PySpark
• Extensive experience with AWS (S3, Glue, Redshift, EMR, Lambda, Kinesis, etc.)
• Extensive experience with Azure (ADF, Synapse, Databricks, ADLS, Event Hubs, etc.)
• Deep understanding of data modeling and warehouse design
• Experience with orchestration tools (Airflow, Prefect, Azure Data Factory)
• Strong experience with relational, NoSQL, and big data technologies
Preferred Qualifications
• Experience with Databricks Lakehouse architecture
• Experience with real-time / streaming data pipelines
• Hands-on experience with Docker, Kubernetes, and containerized data workloads
• Experience with Terraform or cloud-native IaC tools
• Exposure to data governance, metadata management, and catalog tools
• Experience working in Agile/Scrum environments
Nice to Have
• Experience supporting enterprise BI platforms (Power BI, Tableau, Looker)
• Knowledge of machine learning data pipelines
• Prior experience in large-scale or regulated enterprise environments
• Strong communication and stakeholder management skills