Senior Data Engineer with Python, Snowflake (Onsite @ NYC, Need Locals Only)

New York · Contractor (External) · Posted 11 days ago
Negotiable
Title: Senior Data Engineer
Location: Onsite @ NYC (Need Locals)
Duration: Contract
Role Focus: Design, build, and deployment of data pipelines and backend services; Snowflake data modeling; strong data engineering experience.

Job Summary
We are seeking a Senior Data Engineer to design, build, and deploy scalable data pipelines and backend data services. The ideal candidate will have strong hands-on experience in Snowflake data modeling, modern data engineering practices, and building reliable, high-performance data platforms that support analytics and business intelligence.

Key Responsibilities
• Design, build, and deploy end-to-end data pipelines for large-scale data processing.
• Develop and maintain backend data services and APIs to support data consumption.
• Perform Snowflake data modeling, including schema design, optimization, and performance tuning.
• Implement ELT/ETL processes for structured and semi-structured data sources.
• Optimize data storage, query performance, and cost efficiency in Snowflake.
• Ensure data quality, reliability, security, and governance across pipelines.
• Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
• Implement monitoring, logging, and alerting for data pipelines.
• Support CI/CD pipelines for data engineering deployments.
• Document data flows, architectures, and best practices.

Required Skills & Qualifications
• 10+ years of experience in data engineering or related roles.
• Strong hands-on experience with Snowflake (data modeling, performance tuning, security).
• Expertise in SQL and data transformation techniques.
• Proficiency in Python and/or Java for backend data processing.
• Experience building scalable ETL/ELT pipelines.
• Knowledge of data orchestration tools (Airflow, Azure Data Factory, or similar).
• Experience with cloud platforms (AWS, Azure, or Google Cloud Platform).
• Familiarity with CI/CD, Git, and DevOps practices.
• Strong problem-solving and communication skills.
Nice to Have
• Experience with streaming technologies (Kafka, Kinesis).
• Exposure to dbt or modern transformation frameworks.
• Experience in large-scale enterprise or regulated environments.
• Knowledge of data governance and metadata management tools.

Regards,
Sai Srikar
Email: