Salary: £80,000 - £85,000 per year

Requirements:
- Strong experience designing and operating scalable ETL/ELT pipelines.
- Hands-on Microsoft Fabric experience (Dataflows Gen2, Notebooks, semantic models).
- SQL and Python proficiency, with Spark/Spark SQL exposure.
- Practical understanding of data quality, observability and troubleshooting.
- Ability to explain technical concepts clearly and collaborate across teams.
- CI/CD experience (desirable).
- Experience in high-volume or real-time data environments (desirable).
- Familiarity with data cataloguing tools, e.g. Purview (desirable).
- Knowledge of data mesh, AI/ML, or sustainability-focused data practices (desirable).

Responsibilities:
- Design, build and maintain ETL and ELT pipelines, lakehouse structures and semantic models using Microsoft Fabric.
- Build end-to-end data pipelines and Fabric-based lakehouse solutions.
- Create semantic layers using star schema modelling and DAX.
- Embed monitoring, lineage and data quality into pipelines.
- Integrate data from APIs, CRM/ERP systems and other third-party sources.
- Ensure secure, compliant data handling aligned with GDPR and ISO 27001.
- Support CI/CD deployment of version-controlled artifacts.
- Collaborate closely with data architects, analysts and domain teams to support decentralized data products.

Technologies: AI, CI/CD, CRM, DAX, ETL, ERP, Fabric, Support, Python, SQL, Spark, Cloud

More: At Exalto Consulting, we are supporting a major organization undergoing significant data and digital transformation. We are looking for a Senior Data Engineer with strong experience in Microsoft Fabric to help shape and deliver reliable, scalable data products. This role offers you the chance to work in a modern Microsoft Fabric environment, with real investment from our organization, and the opportunity to influence data engineering standards. We value a collaborative culture that promotes continuous improvement and learning, and we support ongoing development, including certification opportunities.
Last updated: week 5 of 2026