Key Responsibilities:
• Lead the overhaul of the company's data infrastructure and create a scalable architecture for analytics and reporting.
• Collaborate with business stakeholders to convert data needs into reporting and analytical solutions.
• Mentor junior data engineers, promoting best practices and continuous improvement.
• Design and maintain efficient data pipelines and warehouse solutions for timely access to critical information.
• Integrate data from various platforms (ERP, CRM, e-commerce) for sales reporting and performance tracking.
• Ensure data governance, reliability, and security while protecting sensitive information.
• Monitor system performance and optimize for scalability and efficiency.
• Develop data models, dashboards, and BI tools for actionable insights.
Requirements:
• Bachelor’s degree in Computer Science, Data Science, or a related field.
• 8+ years in data engineering with experience in large-scale data infrastructure.
• Expertise in Data Lake and Warehouse architecture.
• Proficient in Databricks, Snowflake, Amazon Redshift, and Google BigQuery.
• Experience in batch and streaming data processing.
• Strong SQL and Python skills; knowledge of Scala or R is a plus.
• Familiarity with relational and NoSQL databases (PostgreSQL, MongoDB).
• Knowledge of retail/e-commerce data flows.
• Experience with Spark, Hive, Hadoop, or EMR is a plus.
• Knowledge of BI platforms (AWS QuickSight, Power BI, Tableau).
• Proven ability to build and scale ETL/ELT pipelines.
• Strong analytical and problem-solving skills.
• Fluency in English; proficiency in Cantonese and Mandarin is an advantage.