Snowflake Data Engineer

Location: Vancouver
Employment type: Contract (external)
Compensation: Negotiable
Company Description

HabileLabs is a global partner for Cloud, Data, and AI, dedicated to helping organizations achieve intelligent transformation. As an AWS Advanced Tier Partner with Generative AI and SMB competencies, HabileLabs specializes in building scalable, secure, and cost-efficient platforms that drive long-term growth. With proven expertise in cloud migration, data engineering, and AI solutions, we help organizations in industries such as BFSI and Healthcare meet performance, compliance, and data-reliability needs. By delivering resilient, future-ready solutions, we empower businesses to thrive in the evolving digital landscape.

Role Description

As a Snowflake Data Engineer, you will design and implement Snowflake data solutions, develop and maintain data pipelines, and optimize data models. You will collaborate with cross-functional teams to engineer efficient data workflows, ensure data security and integrity, and support data analytics needs.

Qualifications

The Snowflake Developer / Data Modeler is a key member of our data engineering team, responsible for designing, developing, and optimizing scalable data solutions on the Snowflake platform. This role focuses on building high-performance SQL-based transformations, robust data pipelines, and well-structured data models that enable analytics, reporting, and data-driven decision-making. The ideal candidate brings deep expertise in Snowflake, SQL performance optimization, data modeling, and cloud-based ETL/ELT tools, combined with strong analytical and collaboration skills.

What You'll Do

Snowflake Development
- Design, develop, and maintain Snowflake objects such as warehouses, databases, schemas, tables, views, stages, file formats, tasks, and streams.
- Build and manage Snowflake pipelines, including Snowpipe, ingestion processes, and continuous data flows.
- Implement role-based access control (RBAC) and enforce data-security best practices.
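For a sense of the day-to-day Snowflake development this role involves, here is a minimal sketch of a continuous-ingestion pipeline using a stage, Snowpipe, a stream, a task, and RBAC grants. All object names (buckets, tables, roles, warehouses) are hypothetical and for illustration only.

```sql
-- Illustrative only: every name below is hypothetical.
CREATE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

-- External stage pointing at a (hypothetical) S3 landing path
CREATE STAGE raw_orders_stage
  URL = 's3://example-bucket/orders/'
  FILE_FORMAT = (FORMAT_NAME = 'csv_fmt');

-- Snowpipe: continuous ingestion triggered by cloud storage notifications
CREATE PIPE raw_orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.orders
  FROM @raw_orders_stage;

-- Stream + task: capture new rows and transform them on a schedule
CREATE STREAM orders_stream ON TABLE raw.orders;

CREATE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO analytics.orders (order_id, amount)
  SELECT order_id, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_orders_task RESUME;  -- tasks are created suspended

-- RBAC: read-only access for a (hypothetical) analyst role
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE analyst_role;
```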
SQL Development & Performance Optimization
- Write, optimize, and troubleshoot complex SQL queries.
- Improve query performance through clustering keys, result caching, micro-partition pruning, and warehouse optimization.
- Tune the performance of ETL/ELT jobs and Snowflake compute resources.

Data Modeling & Validation
- Develop scalable conceptual, logical, and physical data models.
- Apply dimensional modeling techniques (star and snowflake schemas).
- Ensure data quality through comprehensive validation, profiling, and reconciliation.

Cloud ETL/ELT Integration
- Use at least one cloud ETL/ELT platform (e.g., Informatica Cloud, Matillion, Azure Data Factory, Talend, dbt, AWS Glue) to ingest and transform data from diverse sources.
- Manage automated workflows, data pipelines, and schedulers.

Architecture & Best Practices
- Understand data-architecture concepts, including data lakes, data warehouses, ingestion patterns, and transformation frameworks.
- Contribute to architectural discussions and recommend Snowflake best practices, new features, and process improvements.
- Collaborate with Data Engineers, Analysts, and Architects to build end-to-end data solutions.

What You Bring
- Strong experience in Snowflake Data Cloud development.
- Advanced SQL development skills.
- Deep knowledge of performance tuning and optimization.
- Hands-on experience in data modeling and validation.
- Experience with one or more cloud ETL/ELT tools.
- Familiarity with Snowflake features such as Time Travel, Cloning, Streams, Tasks, Snowpipe, and Resource Monitors.
- Knowledge of cloud environments (AWS/Azure/GCP).
- Experience with data-architecture design principles.
- Exposure to dbt or similar transformation frameworks.
- Knowledge of DevOps practices (CI/CD pipelines, Git).
- Understanding of orchestration tools such as Airflow or ADF pipelines.

Seniority level: Mid-Senior
Employment type: Contract
Job function: Information Technology
Industries: IT Services and IT Consulting
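As an illustration of the dimensional-modeling and query-optimization skills described above, here is a minimal star-schema sketch with a clustering key chosen to maximize micro-partition pruning. All table and column names are hypothetical.

```sql
-- Hypothetical star schema: one fact table joined to a conformed dimension.
CREATE TABLE dim_customer (
  customer_key  INTEGER,
  customer_name STRING,
  region        STRING
);

CREATE TABLE fact_sales (
  sale_date    DATE,
  customer_key INTEGER,       -- FK to dim_customer
  product_key  INTEGER,
  amount       NUMBER(12,2)
)
CLUSTER BY (sale_date);       -- clustering key aligned with the most common filter

-- A pruning-friendly query: the sale_date predicate lets Snowflake
-- skip micro-partitions that fall outside the 30-day window.
SELECT c.region, SUM(f.amount) AS revenue
FROM fact_sales f
JOIN dim_customer c ON c.customer_key = f.customer_key
WHERE f.sale_date >= DATEADD(day, -30, CURRENT_DATE)
GROUP BY c.region;
```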