Salary: £60,000 to £65,000 per year

Requirements:
- Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
- Excellent proficiency in developing ETL processes and data transformation workflows.
- Strong SQL skills (PostgreSQL) and advanced Python coding ability (essential).
- Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB, and Lambda (essential).
- Understanding of Terraform codebases used to create and manage AWS infrastructure.
- Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
- Familiarity with distributed data processing systems such as Spark or Databricks.
- Experience working with high-performance, low-latency, or large-volume data systems.
- Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
- Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.
- A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
- Willing and eligible to achieve a minimum of SC clearance, which requires at least 5 years' residency in the UK and the right to work in the UK.

Responsibilities:
- Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.
- Create and manage data models that support efficient storage, retrieval, and analysis of data.
- Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB, and Lambda to architect and maintain cloud data solutions.
- Maintain modular Terraform-based Infrastructure as Code (IaC) for reliable provisioning of AWS infrastructure.
- Develop, optimise, and maintain robust data pipelines using Apache Airflow.
- Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.
- Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.
- Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.
- Identify data quality issues and implement data validation and cleansing processes.
- Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.
- Work within a continuous integration environment with automated builds, deployments, and testing.

Technologies: Airflow, AWS, Lambda, Redshift, Databricks, EC2, ETL, PostgreSQL, Python, SQL, Spark, Terraform

More: At Triad Group Plc, we are an award-winning digital, data, and solutions consultancy with over 35 years of experience, primarily serving the UK public sector and central government. We prioritise collaboration and innovation, creating a supportive culture that values every voice. Our team uses cutting-edge technology to deliver high-quality, impactful solutions. We offer a competitive salary of up to £65k, along with 25 days of annual leave, matched pension contributions, private healthcare, gym membership support, and more. Join us in our Godalming or Milton Keynes offices, or work remotely in a role that empowers you to shape data architecture and influence meaningful outcomes.

Last updated: week 6 of 2026