Key Responsibilities
• Design, develop, and maintain end-to-end data pipelines on GCP for banking applications.
• Build real-time and batch data processing solutions using GCP services (Dataflow, Dataproc, Pub/Sub).
• Develop and optimize data warehouses, data lakes, and lakehouse solutions using BigQuery and Cloud Storage.
• Ingest, cleanse, and transform structured and unstructured banking data for analytics and reporting.
• Develop and orchestrate ETL/ELT pipelines using Cloud Composer (Apache Airflow).
• Ensure data security, privacy, and compliance with banking regulations and standards (AML, Basel, GDPR, PCI DSS).
• Collaborate with business analysts, architects, and data scientists to enable data-driven decision making.
• Implement real-time data streaming solutions for payments, fraud detection, and transaction monitoring.
• Perform data modeling, query optimization, and performance tuning.
• Provide production support and resolve incidents efficiently.
• Mentor junior engineers and enforce data engineering best practices on GCP.
Required Skills & Qualifications
• Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.
• 8–12 years of IT experience, including at least 4 years in GCP data engineering.
• Strong expertise with BigQuery, Dataflow (Apache Beam), Dataproc (Hadoop/Spark), Pub/Sub, Cloud Storage, Cloud Composer.
• Proficiency in SQL, Python, PySpark, and data modeling techniques.
• Experience with ETL/ELT processes, data pipelines, and orchestration frameworks.
• Knowledge of banking/financial services domain (Core Banking, Payments, Risk, Compliance, AML/KYC).
• Familiarity with DevOps, CI/CD pipelines, Git, and containerization (Docker/Kubernetes).
• Strong understanding of data governance, lineage, and security frameworks.
• Excellent problem-solving, debugging, and communication skills.
Job Type: Full-time
Pay: $90,000.00–$130,000.00 per year