Our client is on an exciting journey to build a modern, scalable cloud data platform that powers insight, innovation and future AI capabilities across the Group. We’re looking for a talented Data Engineer to help shape and deliver that vision.
As a key member of our Group Data Team, you’ll work hand‑in‑hand with technical and business stakeholders to design, build and optimise data products that drive real business value. If you thrive on solving complex data challenges and want to play a pivotal role in a growing data function, we want to hear from you.
Key responsibilities:
* Partnering with cloud infrastructure and development teams to deliver robust solutions.
* Working within an established SDLC to ensure quality, consistency and control.
* Collaborating with analysts and system owners to define data extraction and transformation rules, including interface contracts.
* Supporting data modelling initiatives and maintaining transformation logic.
* Building ETL/ELT batch and micro‑batch pipelines.
* Developing modelled data sources and products in Snowflake for end‑users.
* Maintaining metadata, enforcing data quality and defining platform standards.
* Administering the cloud data platform.
* Designing and implementing real‑time streaming pipelines using AWS Kinesis, Firehose, and Kafka (MSK).
Skills required:
* Hands‑on experience delivering ETL/ELT pipelines using tools like AWS Glue or Fivetran.
* Experience with transformation tooling such as dbt.
* Strong proficiency in SQL and Python.
* Deep understanding of Snowflake and modern data platform principles.
* Knowledge of cloud‑based data architectures (lakes, lakehouses, warehouses).
* Familiarity with AWS services including S3, DynamoDB, Aurora, RDS, Glue, Athena and EMR.
* Understanding of lakehouse storage formats (Parquet, Delta, Iceberg).
* Working knowledge of data modelling methodologies (Inmon, Kimball).
* Excellent communication skills and the ability to tailor messages for technical and non‑technical audiences.
* Strong documentation skills, including low‑level designs (LLDs) and runbooks.
* Streaming technologies (Kinesis, Firehose, Kafka, Flink).
* Infrastructure‑as‑Code tools such as CloudFormation or Terraform.
Candidates must be based within a commutable distance of Harrogate, as in‑office days will be required.