Position Overview
Primary Title: Data Engineer (Remote, United States).
Industry Sector: Financial services — investment risk analytics, portfolio engineering, and enterprise data platforms. We build scalable data infrastructure and analytics pipelines that power risk signals, regulatory reporting, and client-facing analytics for institutional customers.
We are recruiting a remote Data Engineer to join a high-performance engineering team focused on operationalizing large-scale ETL/ELT and streaming data solutions. You will design, implement, and operate resilient data pipelines and platform components that deliver timely, accurate analytics for trading, risk, and reporting use cases.
Key Responsibilities
• Design, build, and maintain scalable batch and streaming data pipelines to ingest, transform, and deliver high-quality datasets for analytics and ML.
• Author and optimize reusable ETL/ELT workflows using managed orchestration (e.g., Airflow) and Spark-based compute for performance and cost-efficiency.
• Implement and maintain cloud data platform components (data warehouses, storage, access controls) to support ad-hoc analytics and production reporting.
• Collaborate with data scientists, analysts, and SREs to define data schemas, validation rules, monitoring, and SLAs for production datasets.
• Drive data engineering best practices: modular code, CI/CD pipelines, automated testing, observability, and infrastructure-as-code.
• Troubleshoot production incidents, perform root-cause analysis, and implement long-term reliability improvements.
Required Qualifications
• Python
• SQL
• Apache Spark
• Apache Airflow
• Snowflake
• AWS
• Proven experience building production data pipelines for analytics or risk workflows.
• Strong troubleshooting and system-design ability.
• Familiarity with data governance, lineage, and observability practices.
• Authorization to work in the United States.
Preferred Qualifications
• dbt
• Apache Kafka
• Terraform
Benefits & Perks
• Fully remote, US-based role with flexible work policies and distributed engineering teams.
• Focus on professional growth: technical mentorship, learning budget, and opportunities to influence platform design.
• High-impact environment where engineering ownership and data quality drive business outcomes.