Role: Data DevOps Engineer
Location: Foster City, CA
W2 Contract
Onsite
This role is open to W2 candidates only; local candidates required.
Responsibilities:
• Design, build, and maintain a platform used by Client teams to build large-scale data pipelines
• Drive platform efficiency improvements to reduce latency and improve data freshness
• Support data-driven engineering decisions through improved observability and key metrics
• Partner with engineering teams to support the long-term scaling of their data pipelines
Qualifications:
• 5+ years of software engineering experience
• Strong fluency in Python
• Strong understanding of distributed systems
• Strong written and verbal communication skills
• Solid experience building and maintaining data-intensive production systems at scale
• Solid experience with both relational and non-relational (NoSQL) databases
• Solid experience with infrastructure-as-code tools (e.g. Terraform)
• Experience with large-scale streaming platforms (e.g. Kafka, Kinesis)
• Experience with data warehouse platforms (e.g. Redshift, BigQuery, Databricks)
• Experience building scalable and maintainable data pipelines
• Experience with AWS ECS and/or Kubernetes
• Experience with a workflow manager such as Airflow
• Experience with large-scale processing frameworks (e.g. Spark, Hadoop)