As an adopter of Fabric, we are looking for an ambitious data engineer to grow with our use of the platform. This is an opportunity to build something that truly matters: your role will transform how we deliver our care, housing, and community services through the creation of a modern data platform from the ground up.
This isn’t just engineering - it’s a chance to make a real difference, sharpen your cloud skills, and leave a lasting mark on an organisation that improves lives every day.
If you are ready to be part of something more, then apply today!
About the role
This exciting role, within our newly created BI Team, will be hands-on: as our sole data engineer, you will create scalable and automated data pipelines with tools like Microsoft Fabric, Azure Data Factory, SQL, Python, and Spark. You will also:
* Design and build repeatable ingestion (APIs, databases, flat files) with incremental and historical loads.
* Implement resilient ELT/ETL pipelines with parameterization, orchestration, retry/alerting, and logging.
* Create snapshot and slowly changing dimension (SCD) patterns for month-end and trend analysis.
* Optimise performance (partitioning, indexing, caching) and manage cost-efficient refresh cadences (daily/weekly/real-time where appropriate).
* Develop cleaned/curated layers (e.g., Bronze/Silver/Gold or trusted data marts) and star schema models aligned to business definitions.
* Partner with BI developers to ensure visuals are fed by reusable, governed datasets.
* Embed data quality rules (validity, timeliness, completeness), reconciliation against source systems, and issue backlogs.
* Monitor pipeline health and cost; manage incident response and root cause analysis.
* Translate business requirements into technical designs and estimates; run technical workshops and design walkthroughs.
* Produce clear documentation (runbooks, diagrams, standards).
About you
A positive self-starter, you will have advanced SQL skills and experience of using Azure, or similar, to build data pipelines. You will also need experience of configuring APIs, dimensional data modelling (star/snowflake), and semantic modelling for BI.
You will have a solid understanding of data security, PII, GDPR and related data compliance and governance. Alongside this, you will have a track record of delivering within time constraints and a continuous improvement mindset.
It’s not essential but it would be great if you also have experience with:
* Social housing and care data/systems
* Hands-on experience of Fabric
* Spark, Python or other commonly used languages for wrangling data
You will need to be ambitious and self-starting, and be able to adapt to and adopt new technology as it emerges.
Job Benefits
* Flexible Working - The opportunity to work from home up to 3 days a week
* Equivalent to 25 days of paid annual leave (in addition to bank holidays), increasing to the equivalent of 28 after 5 years’ service (pro-rata)
* Pension Scheme
* Annual achievement review with the opportunity for pay progression
* Blue Light Card discount service, offering online and high street discounts
* Care First Employee Assistance Programme (provides a range of free, confidential services) and in-house Mental Health First Aiders available
* Colleague Voice Representatives, enabling you to have your say
* Cycle to Work Scheme
* Company Sick Pay – Linked to length of service
* £200 refer a friend bonus