Your new company
You’ll be joining a large, well-established UK financial services organisation operating within a highly regulated environment. The business is currently delivering a critical Commissions Redress programme, placing data quality, regulatory reporting and delivery pace at the centre of its transformation agenda.
Your new role
We are seeking an experienced Data Engineer / Data Modeller to join a high‑profile Commissions Redress Project within the Data team.
This is a Databricks‑led role, requiring deep, hands‑on expertise in Databricks, Azure Data Factory (ADF) and SQL, alongside strong data modelling capabilities. You will analyse, design and deliver data solutions aligned to clearly defined user stories, ensuring data structures are fit for operational consumption, regulatory reporting and analytical insight.
You will work closely with business stakeholders to define acceptance criteria, design schemas and models, and deliver high‑quality, auditable data outputs. This is not a purely technical role — success is measured by business value delivered, pace of delivery, and your ability to operate effectively within ambiguity.
You will also play an active role in mentoring and upskilling less experienced team members, contributing positively to an agile, collaborative culture.
What you'll need to succeed
Essential (Extensive Hands‑On Experience Required):
- Expert‑level Databricks experience (data engineering, transformations, performance optimisation)
- Strong experience using Azure Data Factory (ADF) for orchestration
- Advanced SQL skills (complex queries, stored procedures, data transformations)
- Proven data modelling expertise (schemas, measures, calculations, KPIs, Critical Data Elements)
- Experience designing and building aggregated master data files
- Strong understanding of Azure Data Lake architectures and dimensional modelling
- Experience delivering operational and regulatory reporting using Databricks and Power BI
Desirable:
- Working knowledge of Power BI Report Builder
- Familiarity with tools such as Power Automate, Power Pivot, SharePoint, Teams
- Understanding of ETL frameworks and data quality controls
- Background working in regulated financial services environments
Soft Skills:
- Ability to work effectively in agile delivery environments
- Strong stakeholder engagement and communication skills
- Ability to coach, influence and support business data analysts
- A collaborative, delivery‑focused mindset
What you'll get in return
- Opportunity to work on a high‑impact regulatory programme
- Long‑term contract potential within a mature data environment
- Hybrid working model with flexibility (minimum 2 days on-site in Milton Keynes)
- A role where Databricks expertise is genuinely valued and central to delivery
- Exposure to senior stakeholders and meaningful business outcomes