Salary: £54,000 - £66,000 per year

Requirements:
- Strong Data Engineering expertise within AWS environments
- Hands-on experience with core AWS data services: S3, Glue, Lambda, Athena, Kinesis, Step Functions (or similar)
- Proficiency in Python and SQL for data transformations and automation
- Experience with IaC and CI/CD tooling (Terraform, GitLab, etc.)
- Comfortable working with sensitive datasets and secure-by-design approaches
- Strong communication skills and a proactive, consulting mindset
- Experience delivering as part of an Agile, multi-disciplinary team

Desirable:
- Knowledge of backend processing for analytics workloads
- Familiarity with containerised deployments (Docker/ECS)
- Experience working to SFIA-aligned delivery expectations in government or regulated contexts

Responsibilities:
- Design, develop and maintain scalable cloud-native data pipelines
- Implement ETL/ELT processes to manage structured and unstructured data securely and efficiently
- Ensure data integrity, traceability and compliance across all pipeline stages
- Work with cross-functional teams to define technical requirements and inform design decisions
- Apply DevOps best practices, monitoring and automation to improve reliability
- Support continuous improvement of the platform's performance and operational maturity
- Communicate progress, risks and trade-offs clearly to wider delivery stakeholders

Technologies: AWS Lambda, Backend, CI/CD, Cloud, DevOps, Docker, ETL, GitLab, Support, Python, SQL, Terraform

More: We are supporting a major government data transformation initiative to strengthen the use of evidence-based insights across frontline and operational teams. As part of a new capability being built to process and analyse sensitive interview information, we require Data Engineers to design, deliver and optimise secure backend data workflows. This foundational work involves building the ingestion, orchestration, storage and transformation layers that power the analytics tool.

Last updated: week 5 of 2026