Principal Data Analytics Engineer

New York | Full-time
Negotiable
**About the Role**

The Data & Analytics team is dedicated to building innovative data products that provide actionable insights, empowering our partners to succeed. We focus on curating and maintaining key data sources and statistics that serve both internal and external stakeholders.

**What You’ll Do**

Advanced Engineering & Coding
• Core Development: Design and code complex dbt models and data transformation logic for high-volume financial datasets (trade transactions, order books).
• Python Automation: Write production-grade Python scripts for advanced data processing, anomaly detection, and custom orchestration logic that SQL alone cannot handle (an illustrative sketch appears at the end of this posting).
• Performance Engineering: Take ownership of the "hardest problems" in query performance. Refactor legacy code and optimize incremental loading strategies to reduce costs and latency at scale.

Technical Architecture & Standards
• CI/CD & DevOps: Own the technical implementation of our data deployment and reporting pipelines (Git, dbt Cloud), ensuring robust version control and seamless integration.
• Data Quality as Code: Engineer automated testing frameworks and validation suites (using dbt tests/Python) to catch data issues before they reach the business layer (see the second sketch at the end of this posting).

Mentorship & Collaboration
• Code Reviews: Review code from both junior and senior developers.
• Technical Guidance: Act as the "go-to" technical resource for senior and junior engineers when they hit blockers, helping them solve code-level problems through pair programming and guidance.
• Cross-Functional Impact: Work across technical and business groups to problem-solve, consulting on the impact of future engineering changes on modeling outputs.

**What You’ll Need**

Technical Mastery
• 6–12+ years of hands-on experience in data engineering or analytics engineering.
• Expert SQL: You don’t just write queries; you understand execution plans, partition pruning, and how to optimize for compute-heavy environments.
• Advanced dbt: Deep experience with dbt internals, custom materializations, macros, and package management.
• Strong Python: Proficiency in Python for data manipulation (Pandas/Polars) and interaction with APIs/AWS services (boto3).
• Financial Data Fluency: Experience architecting data models for complex financial instruments, ledgers, or high-frequency transaction data.
• KPI Architecture: Experience building business metrics, ensuring metric definitions are version-controlled, reusable, and mathematically consistent across all downstream reports.

Engineering Mindset
• Platform Thinking: You build solutions that are reusable and modular, not one-off scripts.
• Production Grade: Experience treating data as software (unit testing, CI/CD, documentation, SLA monitoring).
• Problem Solving: Ability to independently diagnose cryptic error messages in distributed systems (Spark/Databricks/Snowflake) and resolve them.

Applicants must be authorized to work for any employer in the U.S. DriveWealth is unable to sponsor an employment visa at this time.

**Special Knowledge (Nice to Have, But Not Required)**
• Experience with Apache Spark or Databricks for heavy compute workloads.
• Experience implementing Airflow or similar orchestrators.
• Experience with Sigma Computing (from a data modeling perspective).
• Experience using AI/LLM tools to enable faster, smarter analytics workflows.
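To give a flavor of the "Python Automation" responsibility above, here is a minimal sketch of the kind of anomaly-detection script the role describes. It assumes a Pandas DataFrame of trade transactions with hypothetical `executed_at` and `quantity` columns; the window size and z-score threshold are illustrative assumptions, not DriveWealth specifics.

```python
# Illustrative sketch: flag anomalous daily trade volumes with a rolling z-score.
# Column names and thresholds are hypothetical, not taken from the posting.
import pandas as pd

def flag_volume_anomalies(trades: pd.DataFrame,
                          window: int = 20,
                          z_threshold: float = 3.0) -> pd.DataFrame:
    """Return daily trade volumes with a boolean `is_anomaly` flag."""
    # Aggregate raw trades to daily volume.
    daily = (
        trades
        .assign(trade_date=pd.to_datetime(trades["executed_at"]).dt.date)
        .groupby("trade_date", as_index=False)["quantity"].sum()
        .rename(columns={"quantity": "volume"})
    )
    # Rolling z-score over the trailing window; early rows stay NaN and are not flagged.
    rolling = daily["volume"].rolling(window, min_periods=window)
    zscore = (daily["volume"] - rolling.mean()) / rolling.std()
    daily["is_anomaly"] = zscore.abs() > z_threshold
    return daily
```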
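And for the "Data Quality as Code" responsibility, a minimal pytest-style sketch of validation checks written in plain Python. In practice these checks would more likely be expressed as dbt tests against warehouse models; the inline table and column names here are hypothetical stand-ins.

```python
# Illustrative sketch: data-quality checks as code.
# In a real suite the frames would be read from the warehouse, not built inline.
import pandas as pd

def _sample_orders() -> pd.DataFrame:
    # Hypothetical stand-in for an orders model.
    return pd.DataFrame({"order_id": [1, 2, 3], "quantity": [100, 250, 75]})

def test_order_quantities_are_positive():
    orders = _sample_orders()
    assert (orders["quantity"] > 0).all(), "found non-positive order quantities"

def test_order_ids_are_unique():
    orders = _sample_orders()
    assert orders["order_id"].is_unique, "duplicate order_id values detected"
```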