Salary: £45,000–£55,000 per year

Requirements:
- 3–5 years of experience in Data Engineering, with strong exposure to ETL/ELT pipelines
- Strong SQL and Python skills, with hands-on experience building data pipelines
- Experience with dbt, Terraform, and version control (Git)
- Exposure to Airflow (or similar orchestration tools), Docker, and CI/CD practices
- Cloud experience in GCP (BigQuery preferred) or Azure/AWS environments
- Understanding of modern data concepts, including Data Mesh, Agile delivery, and test-driven development
- Strong problem-solving skills, with the ability to work independently and within a collaborative team

Responsibilities:
- Design, build, and maintain scalable data pipelines and data products using modern ELT principles
- Work closely with product managers, architects, and engineers to deliver data solutions aligned to business needs
- Contribute across the full data product lifecycle, from design and development through to deployment and optimisation
- Ensure high-quality, well-documented code and maintain strong engineering standards
- Support CI/CD processes, environment management, and deployment pipelines
- Translate technical challenges into clear, structured solutions for both technical and non-technical stakeholders

Technologies: Airflow, AWS, Azure, BigQuery, CI/CD, Cloud, Docker, ETL, GCP, Git, Support, Python, SQL, Terraform, dbt, Composer

More:
We are a growing, data-driven organisation building out a modern Data Platform following a successful migration from on-premises infrastructure to Google Cloud. Our well-established Data Office is focused on developing a scalable Data Mesh architecture. This is a strong opportunity for a mid-level Data Engineer to join a collaborative, product-led environment, where you will work with modern tooling and contribute to the end-to-end delivery of data products.

This role is remote, with the expectation of being in London, Leeds, or Preston 1–2 days per month.

Last updated: week 17 of 2026