Salary: £40,000 - £65,000 per year

Requirements:
- Proficiency in data pipeline orchestration tools (e.g., Airflow, Prefect, Dagster)
- Extensive experience with Docker and Kubernetes
- Proficiency in CI/CD principles and tools
- Familiarity with open-source data tools (e.g., Spark, Kafka, PostgreSQL)
- Competent understanding of IaC concepts and tools (e.g., Terraform, Ansible)
- Understanding of data architecture principles
- Experience with monitoring and observability tools such as Grafana and Prometheus

Responsibilities:
- Designing, building, automating, and orchestrating data pipelines using tools such as Airflow, Prefect, or Dagster
- Containerising data applications using Docker and deploying them to Kubernetes-based container platforms (e.g., EKS, AKS)
- Implementing and managing CI/CD pipelines for data applications
- Implementing and managing comprehensive monitoring and observability solutions using tools such as Grafana and Prometheus, ensuring data quality across the entire data flow
- Working with Infrastructure as Code (IaC) tools (e.g., Terraform, Ansible) to provision and manage data infrastructure within pre-existing platforms
- Optimising data processing for performance and scalability

Technologies: Airflow, AWS, Ansible, Azure, CI/CD, Cloud, Docker, Flow, GCP, Grafana, Kafka, Kubernetes, PostgreSQL, Prometheus, Spark, Terraform, AI, DevOps, Security

More: At Capgemini, we are part of the Cloud Data Platforms team within the Insights and Data Global Practice, dedicated to driving our customers' digital and data transformation journeys using modern cloud platforms such as AWS, Azure, and GCP. Our hybrid working model allows flexibility with a blend of office, client-site, and home working. We prioritise equity, diversity, and inclusion, creating an inclusive environment where everyone can thrive. Join us to be part of a dynamic team and embrace opportunities for continuous learning and innovation.

Last updated: week 5 of 2026
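The orchestration tools named in this role (Airflow, Prefect, Dagster) all model a pipeline as a directed acyclic graph of dependent tasks and run those tasks in dependency order. A minimal stdlib-only sketch of that idea, for candidates unfamiliar with the pattern; the task names and functions are illustrative, not part of the role:

```python
from graphlib import TopologicalSorter

# Illustrative tasks; a real pipeline step would trigger a Spark job,
# a Kafka consumer, a SQL load into PostgreSQL, etc.
def extract():   return "raw rows"
def transform(): return "clean rows"
def load():      return "loaded"

# DAG: each key depends on the tasks in its value set.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

def run_pipeline(dag, tasks):
    """Run tasks in topological (dependency) order, as an orchestrator would."""
    results = {}
    for name in TopologicalSorter(dag).static_order():
        results[name] = tasks[name]()
    return results

results = run_pipeline(dag, tasks)
print(list(results))  # ['extract', 'transform', 'load']
```

Real orchestrators add scheduling, retries, backfills, and distributed execution on top of this core dependency-resolution loop, which is what makes Docker/Kubernetes experience relevant to running them at scale.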
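The responsibility of "ensuring data quality across the entire data flow" typically means running validation checks between pipeline stages and surfacing violations to monitoring tools such as Prometheus/Grafana. A hypothetical stdlib-only sketch of such a check; the column names and rules are invented for illustration:

```python
# Hypothetical row-level data-quality checks of the kind a pipeline
# might run between stages; "id" and "amount" are illustrative columns.
def check_rows(rows):
    """Return a list of human-readable violations (empty means clean)."""
    violations = []
    for i, row in enumerate(rows):
        if row.get("id") is None:
            violations.append(f"row {i}: missing id")
        if row.get("amount", 0) < 0:
            violations.append(f"row {i}: negative amount")
    return violations

rows = [{"id": 1, "amount": 10.0}, {"id": None, "amount": -5.0}]
print(check_rows(rows))  # ['row 1: missing id', 'row 1: negative amount']
```

In practice the violation count would be exported as a metric (e.g., via a Prometheus client) so Grafana dashboards and alerts can track quality over time.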