Contract Length: Initial 6 months (extension likely)
Day Rate: £500 per day (Outside IR35)
Location: Hybrid in South Wales
Role Overview
We are seeking an experienced Senior Data Engineer contractor to support and extend the Azure Databricks Lakehouse platform, ensuring the reliable delivery of new data pipelines alongside ongoing operational stability.

This is a hands‑on engineering role focused on the implementation, optimisation and support of Delta Lake pipelines across the Bronze, Silver, Staging and Gold layers, operating within an established architectural framework.
Key Responsibilities
1. Develop and maintain data pipelines in Azure Databricks using PySpark and SQL
2. Build incremental processing using Delta Lake and Change Data Feed
3. Extend and maintain watermark‑driven CDC frameworks
4. Implement and maintain Slowly Changing Dimension (SCD Type 2) logic
5. Engineer time‑aware joins and dimensional transformations
6. Support optimisation of Spark workloads and Delta tables
7. Apply a solid understanding of dimensional modelling and star schema design
Medallion Architecture Delivery
8. Implement ingestion patterns into Bronze
9. Build Silver state, CDC and SCD2 tables
10. Develop Gold dimensional models aligned to Kimball principles
11. Maintain clean separation of Bronze, Silver and Gold usage and responsibilities
Delta Lake & Performance Optimisation
12. Work extensively with Delta Lake features, including:
    - ACID transactions
    - Merge‑based upserts
    - Partitioning and data layout strategies
13. Tune Spark jobs and clusters to improve performance and cost efficiency