Responsibilities
• Work with the Head of Data and AI on application-related projects and solutioning activities, ensuring successful delivery and support of each project.
• Engage with stakeholders on business application enhancements to understand business requirements, and participate in developing effective and efficient solutions.
• Participate in technology solutioning and delivery of projects for Bank applications.
• Provide support for System Integration Testing (SIT) and User Acceptance Testing (UAT) prior to production implementation.
• Schedule and support production maintenance activities, such as software deployments, security patches and operating system patches
• Attend to production problems in a timely manner and escalate to management when necessary.
• Work closely and effectively with external vendors and internal IT partners to carry out required assignments.
• Comply with external/internal regulatory requirements, internal control standards and Group compliance policy.
• Contribute to proper application documentation to build knowledge assets for the Bank.
• Participate actively in Department and Bank initiatives and activities.
• Design and maintain efficient data pipeline architectures to ensure seamless data flow and processing.
• Compile and manage extensive, intricate data sets that fulfill both functional and non-functional business requirements.
• Identify, design, and implement internal process enhancements by automating manual tasks, optimizing data delivery, and re-engineering infrastructure for improved scalability.
• Develop and maintain the infrastructure necessary for optimal data extraction, transformation, and loading (ETL) from diverse data sources using SQL and AWS big data technologies (see the illustrative sketch after this list).
• Create advanced analytics tools that leverage the data pipeline to deliver actionable insights into customer acquisition, operational efficiency, and other critical business performance metrics.
• Collaborate with stakeholders including Executive, Product, Data, and Design teams to address data-related technical issues and support their data infrastructure needs.
• Ensure data security and compliance by maintaining data separation and security across multiple data centers
• Develop data tools for the analytics and data science teams to aid in building and optimizing our product, positioning it as an innovative industry leader.
• Work closely with data and analytics experts to enhance the functionality and performance of our data systems.
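As an illustration of the ETL and pipeline-orchestration responsibilities above, the sketch below shows a minimal daily extract-transform-load workflow, assuming Apache Airflow 2.x (one of the workflow tools named under Requirements). The DAG id, schedule and task bodies are hypothetical placeholders, not the Bank's actual pipeline.

    # Minimal sketch of a daily ETL workflow, assuming Apache Airflow 2.x.
    # The DAG id, schedule and task bodies are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: in practice this would run SQL against a source system.
        print("extracting rows for", context["ds"])


    def transform(**context):
        # Placeholder: apply business rules / cleansing to the extracted batch.
        print("transforming batch for", context["ds"])


    def load(**context):
        # Placeholder: load the curated batch into the warehouse (e.g. Redshift).
        print("loading batch for", context["ds"])


    with DAG(
        dag_id="daily_customer_etl",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                  # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

The three-task chain is illustrative only; the same structure extends to additional sources, data-quality checks and downstream analytics tasks.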
Requirements
• University Bachelor's Degree preferably majoring in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.
• Professional certification in AWS Cloud is a plus
• Min. 10 years of experience in Data Engineering / Data Management
• Knowledge of application management, project SDLC and ITIL processes in a banking environment
• Experience in support and troubleshooting for Production and UAT/SIT environments (coordination and hands-on)
• Proven experience in building and optimizing big data pipelines, architectures and data sets
• Good awareness of the emerging technology landscape
• Good communication skills and the ability to present technical proposals to internal IT stakeholders
• Working knowledge of the following tools and technologies:
• Big data tools: Hadoop, Spark, Kafka, etc.
• Relational SQL and NoSQL databases: Postgres, Cassandra, etc.
• Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
• AWS cloud services: EC2, EMR, RDS, Redshift; open table formats such as Apache Iceberg
• Stream-processing systems: Storm, Spark Streaming, etc. (see the brief sketch after this list)
• Object-oriented and functional programming languages: Python, Java, C++, Scala, etc.
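For context on the stream-processing and big data items above, the following is a minimal sketch, assuming PySpark with the Kafka connector package available on the cluster. The broker address, topic name and console sink are hypothetical placeholders used only for illustration.

    # Minimal sketch: consume a Kafka topic with Spark Structured Streaming.
    # Broker address and topic name are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

    # Read the topic as an unbounded streaming DataFrame.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "transactions")               # placeholder topic
        .load()
    )

    # Kafka delivers the payload as binary; cast it to string for downstream use.
    parsed = events.select(col("value").cast("string").alias("payload"))

    # Write the running stream to the console, a sink suitable for local testing.
    query = (
        parsed.writeStream
        .format("console")
        .outputMode("append")
        .start()
    )
    query.awaitTermination()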