Data Engineer (Snowflake / Python / SQL / dbt)
Location: London (2–3 days onsite/week)
Contract: 12 months (extension likely)
Overview
We are seeking a Data Engineer with strong cloud data platform experience to support the build, optimisation, and delivery of scalable data solutions within a global financial services environment. This is a hands-on role covering pipeline engineering, data transformation, data quality, and stakeholder engagement.
Responsibilities
* Design, build, and maintain scalable data pipelines and ETL/ELT workflows
* Develop and optimise data models within Snowflake
* Build and manage transformation layers using dbt
* Write efficient SQL for data extraction, transformation, and reporting
* Develop Python-based automation and data processing solutions
* Monitor pipeline performance, data quality, and system reliability
* Collaborate with analysts, product, and engineering teams on data requirements
* Provide clear stakeholder updates on delivery progress and risks
* Support Agile delivery practices (Scrum ceremonies, Jira)
Skills
* Strong Snowflake engineering and performance optimisation
* Advanced SQL development and data modelling
* Python for data engineering, automation, and scripting
* Hands-on dbt experience (models, testing, documentation)
* ETL / ELT pipeline development experience
* Experience within financial services / regulated environments preferred
* Cloud platform exposure (AWS, Azure, or GCP) beneficial
* Agile delivery experience and strong stakeholder management skills