Data Engineer (Python, DBT, Snowflake) – Join a Fast-Growing AI Tech Scale-Up!
Are you a passionate Data Engineer with deep Python skills and a love for building robust data pipelines? Want to be at the forefront of innovation, helping shape the future of AI-driven platforms? This is your opportunity to make a real impact.
We’re partnering with a high-growth London-based technology scale-up that’s investing heavily in its data capabilities. With a greenfield data platform underway, this is your chance to join at the beginning of an exciting journey.
You’ll be instrumental in defining, building, and optimising the company’s new ML/AI data infrastructure. Working alongside engineering and product stakeholders, you’ll help shape and scale the data function from the ground up.
What You’ll Do
* Architect and implement modern, scalable data pipelines using Python, Airflow, and DBT.
* Design high-quality, governed data models to support a fast-moving AI product team.
* Collaborate with data scientists and ML engineers to deliver clean, reliable data.
* Build out a cloud-native infrastructure on Azure using Terraform and other IaC tools.
* Drive best practices across data quality, testing, observability, and CI/CD.
* Influence technical decisions and contribute to the long-term vision for the platform.
What You’ll Bring
* Strong hands-on engineering skills in Python (clean, production-grade code is a must)
* Experience working with DBT and orchestrating data workflows with Airflow
* Deep SQL knowledge and experience working with Snowflake (or similar)
* Proven ability to work in fast-paced environments (scale-up/start-up experience a plus)
* Passion for collaborating across teams and a strong interest in AI/ML platforms
* A degree in Computer Science or a related technical field (or equivalent experience)
If interested, please apply for further details and a confidential discussion.