GCP Data Engineer

Chicago · 29 days ago · Full-time · External
632.5k - 913.6k / yr
We are seeking a GCP Data Engineer with hands-on experience in Google Cloud Platform (GCP) services, particularly BigQuery, and proficiency in DBT (Data Build Tool) for data transformation and modeling. This role requires strong SQL skills, a solid understanding of data warehousing concepts, and basic-to-intermediate proficiency in Python for scripting and automation.

Key Responsibilities:
• Design, build, and maintain scalable data pipelines on GCP, primarily using BigQuery.
• Develop and manage DBT models to transform raw data into clean, tested, and documented datasets.
• Write complex, optimized SQL queries for data extraction, transformation, and analysis.
• Implement and maintain data warehousing solutions, ensuring performance, scalability, and reliability.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
• Monitor and troubleshoot data pipeline performance and data quality issues.
• Automate data workflows and tasks with Python scripts where necessary.
• Ensure data governance, security, and compliance standards are met.

Required Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 4-5 years of experience in data engineering or a similar role.
• Strong expertise in SQL with the ability to write efficient, complex queries.
• Proficiency in DBT for data modeling and transformation.
• Hands-on experience with BigQuery and other GCP data services.
• Solid understanding of data warehousing principles and best practices.
• Basic to intermediate Python skills for scripting and automation.
• Familiarity with version control systems such as Git.
• Excellent problem-solving and communication skills.