Citylitics delivers predictive intelligence on local utility & public infrastructure markets
Infrastructure is the roadways you rely on to get safely to Grandma's house, the potable water that comes out of your kitchen tap to wash your family's food, and the energy that heats your home and powers your digital lifestyle.
Every year, trillions of dollars are spent on all areas of infrastructure to maintain our quality of life and move our economy forward. Every day, the news carries stories of infrastructure failures: bridge collapses, power blackouts, water main breaks. Climate change and extreme weather events are disrupting the basic infrastructure we have taken for granted for years.
Citylitics is solving the hardest data problems in infrastructure while building the sales intelligence platform that enables a faster, more transparent, and more efficient infrastructure marketplace. We turn millions of unstructured documents into high-value intelligence feeds and datasets, delivered through an intuitive user experience. Our goal is to enable solution providers to connect with cities that have relevant infrastructure needs in a faster and more digital way than traditional market channels.

We are a dynamic and growing data engineering team responsible for building and maintaining the data infrastructure and pipelines that power Citylitics' data-driven decisions. We're passionate about data quality, automation, and empowering our stakeholders with insightful data visualizations. We work in a collaborative environment that encourages learning and growth, and we're looking for a motivated Junior Data Engineer to join our ranks!
As a Junior Data Engineer, you will play a key role in developing and maintaining our data pipelines and dashboards. You will work closely with senior engineers to design, implement, and test data solutions using a variety of tools and technologies.
Your primary focus will be building interactive and informative dashboards using Dash and Plotly; you will also contribute to the development of our Airflow-based data pipeline infrastructure. This role offers a great opportunity to learn and grow in a fast-paced environment, contributing to impactful projects and expanding your data engineering skills.
Contribute to the development and maintenance of complex data pipelines using Apache Airflow.
Implement data quality checks and monitoring to ensure data accuracy and reliability.
Collaborate with senior engineers on the design and implementation of new data solutions.
Work with stakeholders to understand their data needs and translate them into actionable dashboards and reports.
Assist in the migration and integration of data from various sources.
Contribute to the documentation and maintenance of our data infrastructure.
Explore and learn new technologies and tools within the data engineering landscape.
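To give a concrete flavour of one responsibility above, here is a minimal sketch of a data quality check in plain Python. It is illustrative only: the function, class, and field names are hypothetical, not part of Citylitics' actual pipeline, and in practice a check like this would run as a task inside an Airflow DAG.

```python
from dataclasses import dataclass


@dataclass
class QualityReport:
    """Summary of a completeness check over a batch of records."""
    total: int
    failed: int

    @property
    def pass_rate(self) -> float:
        # An empty batch trivially passes.
        return 1.0 if self.total == 0 else (self.total - self.failed) / self.total


def check_required_fields(rows, required):
    """Count rows missing any required field (None or empty string)."""
    failed = sum(
        1
        for row in rows
        if any(row.get(field) in (None, "") for field in required)
    )
    return QualityReport(total=len(rows), failed=failed)


# Hypothetical usage: validate scraped city documents before loading downstream.
rows = [
    {"city": "Toronto", "doc_url": "https://example.com/a.pdf"},
    {"city": "", "doc_url": "https://example.com/b.pdf"},
    {"city": "Hamilton", "doc_url": None},
]
report = check_required_fields(rows, required=["city", "doc_url"])
```

A real pipeline would typically alert or halt the load when `pass_rate` drops below a threshold, rather than just computing a report.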
Python, Django, Cloud SQL, and Airflow/Cloud Composer as our main language, web framework, database, and orchestration tool, respectively
At least 1 year of experience with Python, Dash, and Plotly
Experience with Google Cloud Platform and Docker is an asset
Understanding of data modeling concepts and best practices.
Opportunity to work for one of the top 15 innovative analytics startups in Canada revolutionizing data intelligence
You get to bring a disruptive solution with a compelling value proposition to an industry that is eager to hear from you, in a market with no direct competition.
We live at the intersection of infrastructure, scaleup, and data science/AI. There is no corporate bureaucracy here. You will accomplish more here in a few months than you would in a few years at a large, entrenched technology company.
We believe that Data and AI will play an outsized role in our future, so we equip every team member with access to Generative AI tools and our full Data Universe to enhance their productivity and encourage innovation through experimentation.
We are committed to making diversity and inclusivity part of our culture!