The role involves:
- Building robust batch and streaming data pipelines for production-grade data products/platforms
- Building and maintaining both cloud and on-premises data infrastructure
- Collaborating with other stakeholders, such as audit innovation solution designers/architects and the central team
Job Requirements:
- 2-4 years of experience in ETL, data pipeline building, and data warehousing, with the ability to demonstrate and elaborate on past use cases/projects in your CV
- Experience in extracting and working with data from ERP systems such as SAP/Oracle.
- Exposure to database technologies such as Microsoft SQL Server/PostgreSQL; experience with cloud platforms (Azure/AWS) is an advantage
- Advanced understanding of database principles, security, and administration
- Familiar with SQL, dimensional modelling, data warehousing, and data integration
- Familiar with system architecture design and analysis
- Fluency in more than one programming language, including Python
- Ability to pick up new languages and technologies quickly.
- A positive attitude and a willingness to learn and take on responsibilities that may be outside the scope of your current skills
- Strong problem-solving skills, curiosity and passion for technology
- Strong coordination and time-management skills to handle complex projects and meet project deadlines
- Strong communication skills and the ability to work well in a team