Salary: £70,000 - £85,000 per year
Requirements:
* Proven commercial experience as a Lead Data Engineer.
* Hands-on experience delivering solutions on Azure + Databricks.
* Strong PySpark and Spark SQL skills within distributed compute environments.
* Experience working in a Lakehouse/Medallion architecture with Delta Lake.
* Understanding of dimensional modelling (Kimball), including SCD Type 1/2.
* Exposure to operational concepts such as monitoring, retries, idempotency and backfills.
* Good energy and enthusiasm.
* Keen to grow within a modern Azure Data Platform environment.
* Comfortable with Git, CI/CD and modern engineering workflows.
* Able to communicate technical concepts clearly to non-technical stakeholders.
* Quality-driven, collaborative and proactive.
Responsibilities:
* Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL.
* Work within a Medallion architecture (Bronze - Silver - Gold) to deliver reliable, high-quality datasets.
* Ingest data from multiple sources — including ChargeBee, legacy operational files, SharePoint, SFTP, SQL databases, and REST and GraphQL APIs — using Azure Data Factory and metadata-driven patterns.
* Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations.
* Develop clean and conforming Silver & Gold layers aligned to enterprise subject areas.
* Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets.
* Apply governance, lineage and permissioning through Unity Catalog.
* Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs.
* Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability.
* Assist in performance tuning and cost optimisation.
* Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.
* Support secure deployment patterns using private endpoints, managed identities and Key Vault.
* Participate in code reviews and help improve engineering practices.
* Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business.
* Contribute to architectural discussions and the ongoing data platform roadmap.
Technologies:
* Azure
* CI/CD
* Databricks
* DevOps
* Git
* GraphQL
* Power BI
* PySpark
* REST
* SQL
* SharePoint
* Spark
* Unity Catalog
* Cloud
* Fabric
* Python
More:
We are a leading FMCG company based near Glasgow, undergoing a major transformation of our data landscape. We are building a modern Azure + Databricks Lakehouse platform, empowering our teams across Finance, Operations, Sales, Customer Care, and Logistics. Our team is focused on delivering high-quality solutions and fostering real progression opportunities within a growing data function. We offer a dynamic work environment that emphasises collaboration, quality, and innovation, with the opportunity to directly impact multiple business domains.
Last updated: week 7 of 2026