Job Description
Data Engineer
Long Stratton, Norwich, Norfolk
£56,000 per annum
Full Time: 37 hours per week
Saffron is looking for a talented Data Engineer to help drive the next stage of our data transformation. This role is all about building and optimising our Azure-based data platform, developing high-performing pipelines in Azure Data Factory, and supporting our move toward Microsoft Fabric. You will work closely with BI Analysts and teams across the business to deliver reliable, high-quality data that powers smarter decisions and sharper insights. It is a chance to shape a modern, scalable data environment and make a real impact on how we use data across the organisation.
Key Responsibilities:
* Design, build, and maintain a scalable Azure-based data warehouse that meets the current and future requirements of the Data & Analytics team.
* Lead the introduction, adoption, and optimisation of Microsoft Fabric (e.g., Lakehouse, Warehouse, Data Engineering, Pipelines).
* Apply CI/CD practices (e.g., Azure DevOps) for version control, deployment automation, and environment management.
* Implement data quality checks, pipeline observability, alerting, and automated monitoring to ensure consistent platform reliability.
* Work collaboratively with data owners and the wider data team to ensure data definitions, lineage, and ownership are clearly established.
* Provide technical guidance and coaching to the wider data team members on data engineering best practices.
For a full list of responsibilities, please see the attached Role Profile.
Our Ideal Candidate Will Have:
Education and Qualifications:
* Degree in Computer Science, Data Engineering, Mathematics, or a related discipline, or equivalent experience (E)
* Microsoft certifications in SQL, Fabric (including Power BI), or other Azure Data Services (D)
Experience:
* Advanced SQL skills, including optimisation of complex queries (E).
* Experience building data pipelines and ETL/ELT workflows using tools such as Azure Data Factory, Databricks, Airflow, Luigi, or similar (E).
* Strong understanding of data modelling (E)
* Programming skills in Python and/or Scala for data processing (D).
* Experience with machine learning pipelines or MLOps frameworks (D).
Personal Attributes:
* Confident communicator able to engage both technical and non-technical audiences.
* Proactive, innovative, and committed to continuous improvement.
* Collaborative, with mentoring and leadership capabilities.
* Customer-focused, with a commitment to improving services through data.
* Experience managing a busy, fast-paced workload and delivering multiple projects to deadline.