Duration: initial contract until end of March 2026, with possible extension
Daily rate: £495 (inside IR35)
Location: Remote
Clearance: active SC
Key Responsibilities:
* Design, develop, and optimize data pipelines using Microsoft Fabric and Azure services (e.g. Azure Data Factory, Azure Synapse Analytics, Azure Databricks).
* Build and maintain scalable, high-performance data architectures to support analytics, reporting, and machine learning workloads.
* Implement data ingestion, ETL/ELT processes, and data warehousing solutions.
* Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.
* Ensure data quality, security, and compliance with organizational and regulatory standards.
* Monitor and troubleshoot data pipelines, optimizing for performance and cost-efficiency.
Experience:
* Strong hands-on experience with Microsoft Fabric.
* Proficient in Python for data engineering (e.g. Pandas, PySpark, asyncio, automation scripts).
* Strong SQL skills (T-SQL or similar) for transforming and modeling data.
* Experience building scalable ETL/ELT pipelines using cloud technologies.
* Good understanding of data modeling (star/snowflake), data warehousing, and modern data lakehouse principles.
* Familiarity with version control (Git) and CI/CD pipelines.
* Experience working in Agile/Scrum environments.
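For a flavour of the day-to-day work, the ETL/transformation skills listed above can be sketched with a minimal pandas example (column names, table shape, and the `transform_orders` helper are hypothetical, not from the role description):

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw order records and aggregate revenue per customer."""
    # Drop rows with no customer key (basic data-quality step)
    df = raw.dropna(subset=["customer_id"]).copy()
    # Derive a revenue measure from quantity and unit price
    df["revenue"] = df["quantity"] * df["unit_price"]
    # Aggregate to one row per customer, as a reporting layer might expect
    return (df.groupby("customer_id", as_index=False)["revenue"]
              .sum()
              .sort_values("customer_id", ignore_index=True))

raw = pd.DataFrame({
    "customer_id": ["A", "A", "B", None],
    "quantity": [2, 1, 3, 5],
    "unit_price": [10.0, 20.0, 5.0, 1.0],
})
result = transform_orders(raw)
```

In a production pipeline the same shape of transform would typically run in PySpark or a Fabric notebook against lakehouse tables rather than in-memory frames.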