My client is seeking a Senior Data Engineer to play a pivotal role in designing, scaling, and maintaining data pipelines and Lakehouse architectures. You will collaborate closely with internal stakeholders and external partners to optimise existing systems, with a particular focus on enhancing fan engagement through digital platforms.
Key Responsibilities
* Design and develop ETL/ELT pipelines in Azure and Databricks, ensuring reliability and performance.
* Construct Kimball-style dimensional models to support analytics and reporting.
* Implement automated testing for data quality assurance and validation.
* Ensure compliance with data governance, legal, and regulatory standards.
* Collaborate with the wider Data team to optimise pipelines and enhance platform capabilities.
Essential Skills & Experience
* Hands-on expertise with Databricks, PySpark, and Delta Lake.
* Proven ability to build production-grade ETL/ELT pipelines, including integration with SFTP and REST APIs.
* Strong knowledge of Kimball methodology within Lakehouse frameworks.
* Advanced proficiency in Azure data services (ADF, ADLS Gen2, Event Hubs) and SQL.
* Excellent analytical and troubleshooting skills for data transformation and quality assurance.
* Strong communication skills to translate technical concepts for stakeholders and collaborate across teams.
Outside IR35, with occasional travel to the client site required.