You can be part of the team that enables significant service improvements for our people.
You're all about the detail, and this role will let you drive evidence-based decision-making to optimise systems and efficiency, not just for the business but for our customers too. It's data and research with impact you can see.
In this role, you will:
• Take on the role of Senior Data Engineer within the IT division, responsible for maintaining, modifying, improving, cleansing, and manipulating data from divisional and external databases.
• Create and maintain optimal data pipelines from diverse internal and external sources into a large cloud environment.
• Implement solutions for large datasets in alignment with reference designs, meeting functional requirements of clusters, branches, and consumers.
• Drive internal process improvements by automating manual processes, streamlining data delivery, and enhancing infrastructure scalability and efficiency.
• Ensure the secure, virtualised separation of cloud environments and contribute significantly to data protection within secured perimeters.
• Employ a DataOps framework for data ingestion and transformation, applying data modelling concepts such as Star Schema and Normalisation.
• Demonstrate proficiency in data integration using Azure services such as Data Factory, Data Lake, Key Vault, and Databricks.
• Use SQL, Power BI, scripting languages, and Medallion Architecture concepts (Bronze, Silver and Gold layers) to support effective data management; a brief illustrative sketch follows this list.
• Use tools such as Jira and Confluence to produce requirement artefacts, including business requirements, functional specifications, data flow diagrams, entity-relationship diagrams (ERDs), API and interface designs, test plans, and test cases.
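To give a flavour of the Medallion Architecture work mentioned above, here is a minimal, hypothetical PySpark sketch of a Bronze-to-Silver-to-Gold flow of the kind typically built on Databricks. All paths, table names and column names (for example bronze.trips and gold.daily_route_trips) are illustrative assumptions, not part of any Transport for NSW system.

```python
# Minimal, hypothetical sketch of a Bronze -> Silver -> Gold (Medallion) flow.
# All paths and table names are assumptions for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw records as-is, tagged with ingestion metadata.
bronze = (
    spark.read.json("/mnt/raw/trips/")  # hypothetical landing path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.mode("append").format("delta").saveAsTable("bronze.trips")

# Silver: cleanse and conform - deduplicate, enforce types, drop bad rows.
silver = (
    spark.read.table("bronze.trips")
    .dropDuplicates(["trip_id"])
    .withColumn("trip_date", F.to_date("trip_date"))
    .filter(F.col("trip_id").isNotNull())
)
silver.write.mode("overwrite").format("delta").saveAsTable("silver.trips")

# Gold: aggregate into a consumption-ready table for reporting (e.g. Power BI).
gold = (
    silver.groupBy("route_id", "trip_date")
    .agg(F.count("*").alias("trip_count"))
)
gold.write.mode("overwrite").format("delta").saveAsTable("gold.daily_route_trips")
```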
For more information, please view a copy of the role description.
About you
• Proficiency in developing and maintaining efficient data pipelines from diverse sources into cloud environments.
• Hands-on experience with large-scale datasets and implementing solutions aligned with modern design standards.
• Proven ability to automate processes, streamline data delivery, and enhance infrastructure scalability.
• Familiarity with DataOps methodologies for agile data management and transformation.
• Sound knowledge of data modelling principles such as Star Schema and Normalisation; a brief star-schema query example follows this list.
• Experience working with Azure services such as Data Factory, Data Lake, Key Vault, and Databricks.
• Proficient in SQL, Power BI, and scripting languages for effective data manipulation and analysis.
• Understanding of Medallion Architecture concepts (Bronze, Silver, Gold) for data quality and governance.
• Capable of producing comprehensive requirement artefacts using modern collaboration tools such as Jira and Confluence.
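As a concrete, purely hypothetical illustration of the star-schema modelling referred to above, the sketch below queries an assumed fact table and two assumed dimension tables with Spark SQL; none of the names reflect an actual Transport for NSW schema.

```python
# Hypothetical star-schema query: one fact table joined to its dimensions.
# Table and column names are illustrative assumptions only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# fact_trips holds one row per trip (measures plus foreign keys);
# dim_route and dim_date hold descriptive attributes, normalised out
# of the fact table as per standard dimensional modelling.
daily_patronage = spark.sql("""
    SELECT d.calendar_date,
           r.route_name,
           SUM(f.passenger_count) AS passengers
    FROM   gold.fact_trips f
    JOIN   gold.dim_route  r ON f.route_key = r.route_key
    JOIN   gold.dim_date   d ON f.date_key  = d.date_key
    GROUP  BY d.calendar_date, r.route_name
""")
daily_patronage.show()
```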
Who we are
Transport for NSW provides a safe, integrated and efficient transport system and services. We connect people, communities and industry. The work we do connects the journeys you take every day.
Join us
Our workforce is as diverse as the community we serve. If you’d like further information on our inclusion and diversity initiatives, visit Transport careers.
We offer a wide range of employee benefits, like our award-winning flexible and hybrid work options.
This role is hybrid-friendly, meaning you can mix in-person days at your team’s home base location with remote days.
What are you waiting for? Connect with us. Apply now!
Applications close: 11:59 pm 18th February 2024
For more information about this role, please contact Srivatsa.Lakshminarayana@transport.nsw.gov.au
People living with disability are supported throughout the recruitment process and at work. Visit Supporting people with disability for more info or speak to your talent team member to arrange any adjustments to how you interact with us.
Need some help with your application? Take a look at our application tips video series.
#LI-Hybrid