Responsibilities
* Lead the design and development of robust data pipelines and data platforms to support analytics and AI workloads
* Build and maintain batch and streaming data solutions, including ingestion, transformation, and orchestration
* Design appropriate data models using relational, dimensional, or other modelling techniques, selecting approaches based on the requirements of analytics, reporting, and data science use cases
* Collaborate across multiple teams and client stakeholders to design and deliver end‑to‑end data and AI solutions within the wider client technology landscape
* Work across cloud‑based data platforms and services (e.g. AWS, Azure, or GCP)
* Design and deliver data solutions that are functionally correct and fit for purpose, while meeting non‑functional requirements such as security, performance, resilience, and maintainability
* Support and guide junior engineers through technical leadership and mentoring
* Contribute to platform and engineering standards, best practices, and continuous improvement across engagements
Required Qualifications
* Bachelor's Degree in a related field
* Strong experience with Python, SQL, and data engineering frameworks or tools
* Proven experience designing and delivering data pipelines and data platforms in a professional environment
* Hands-on experience with cloud data services (e.g. Azure Data Factory, Databricks, Synapse, AWS Glue, Redshift, BigQuery, or equivalents)
* Experience working with relational and/or NoSQL databases
* Solid understanding of data modelling, integration patterns, and data quality principles
* Strong problem‑solving and communication skills, with experience working in multidisciplinary delivery teams
Preferred Qualifications
* Familiarity with data pipeline orchestration tools (e.g., Apache Airflow, Luigi)
* Knowledge of data quality and metadata management practices
* Understanding of data virtualisation and data federation techniques
* Experience with big data technologies (e.g., Hadoop, Spark)
* Experience supporting analytics, data science, or machine learning use cases
* Exposure to streaming or event‑driven architectures (e.g. Kafka or equivalent)
* Experience with CI/CD, infrastructure as code, or platform automation
* Background delivering solutions in regulated or security‑constrained environments such as public sector or financial services
This role is subject to pre‑employment screening in line with the UK Government's Baseline Personnel Security Standard (BPSS). An additional range of personal security controls, referred to as National Security Vetting (NSV), may apply, which could include meeting the eligibility requirements for Security Check (SC) or Developed Vetting (DV) clearance.