Important note: Candidates whose background is primarily in SQL and Informatica will not be considered.
We are looking for developers with strong coding and test-writing skills to join the data team.
Responsibilities:
• Design, build, and manage reliable data pipelines and ETL/ELT workflows using Python.
• Ensure code quality through strong unit and integration testing practices.
• Work closely with data teams to gather requirements and automate deployment, monitoring, and troubleshooting.
• Improve system performance by optimizing data storage and resolving issues.
Requirements:
• Degree in Computer Science, Information Technology, or a related field.
• Minimum 5 years of hands-on experience in software or data engineering, with strong Python programming skills.
• Solid background in unit and integration testing.
• Understanding of DevOps best practices and Agile development methodologies.
• Strong problem-solving, software engineering, and communication skills.
• Team-oriented mindset with the ability to work independently when needed.
Nice to Have:
• Experience working with AWS cloud services and Kubernetes (K8s).
• Familiarity with modern data platforms like Snowflake, Apache Spark, or Hive.
• Experience using orchestration tools such as Apache Airflow, Dagster, or Prefect.
• Knowledge of GitHub workflows and monitoring tools like Datadog.
Original job posting: Software Engineer / Python / DevOps / Apache tools, posted on GrabJobs.