Senior Data Engineer
Ding! Ding! Trams are the heartbeat of Melbourne – come be part of our team on the largest and most iconic tram network in the world.
About the Opportunity
We are seeking a highly capable Senior Data Engineer to join our Data and Integration team. This is a 2-year fixed-term role reporting to the Team Manager, Data and Integration. You will work across business and technical teams to build scalable pipelines, enhance data models, improve data quality, and support analytics tools that enable insights across Yarra Trams.
You will work across multiple Azure data services, ensuring the efficient movement, transformation and availability of data that supports operational performance, contract SLAs and broader organisational objectives. This is an opportunity to work with modern cloud technologies and contribute directly to the data capabilities that underpin Melbourne’s iconic tram network.
What You Will Be Doing
- Building and maintaining scalable data pipelines within Azure Data Factory.
- Transforming, ingesting, and deploying large, complex datasets.
- Improving internal data processes including automation and optimisation.
- Supporting and maintaining data analytics tools and reporting capabilities.
- Working with stakeholders to resolve data-related issues and support platform needs.
- Building and maintaining data transformations, data structures, and metadata processes.
- Ensuring data security classifications and controls are adhered to.
- Delivering work to agreed timelines, budgets and scope.
- Conducting detailed analysis and configuration of Azure data solutions.
- Supporting database maintenance, security access, audit reporting and application performance monitoring.
- Participating in incident and problem management as required.
What You Will Bring
- Minimum 3 years’ experience in a data engineering role.
- Expert SQL capability and experience with relational databases.
- Strong experience with Azure Data Factory and enterprise-scale data pipelines.
- Experience with Azure Data Lake, data marts and Erwin modelling tools.
- Proven experience building and optimising cloud data architectures in Azure.
- Experience with ELT and ETL for large datasets (terabyte scale).
- Strong background in data quality, data cleaning and enrichment processes.
- Analytical skills and ability to work with unstructured datasets.
- Strong communication skills and the ability to work with stakeholders across the business.
- Experience supporting BI solutions and data management strategies.
Applications close:
26 April 2026