Salary: £95,000 per year

Requirements:
- Proven experience delivering production-ready data solutions on Google Cloud Platform
- Strong knowledge of batch and streaming frameworks, data pipelines, and orchestration tools
- Expertise in designing and managing structured and unstructured data systems
- Experience translating business needs into technical solutions
- Ability to mentor and coach teams and guide technical decision-making
- Excellent communication skills, with the ability to explain technical concepts to both technical and non-technical stakeholders
- A pragmatic approach to problem-solving, combined with a drive for technical excellence

Responsibilities:
- Lead the design, development, and delivery of data processing solutions using GCP tools such as Dataflow, Dataproc, and BigQuery
- Design automated data pipelines using orchestration tools such as Cloud Composer
- Contribute to architecture discussions and design end-to-end data solutions
- Own development processes for your team, establishing robust principles and methods across architecture, code quality, and deployments
- Shape team behaviours around specifications, acceptance criteria, sprint planning, and documentation
- Define and evolve data engineering standards and practices across the organisation
- Lead technical discussions with client stakeholders, securing buy-in for solutions
- Mentor and coach team members, building technical expertise and capability
- Develop production-ready data pipelines and processing jobs using batch and streaming frameworks such as Apache Spark and Apache Beam
- Apply expertise in data storage technologies, including relational, columnar, document, and NoSQL databases, data warehouses, and data lakes
- Implement modern data pipeline patterns, event-driven architectures, ETL/ELT processes, and stream processing solutions
- Translate business requirements into technical specifications and actionable solution designs
- Work with metadata management and data governance tools such as Cloud Data Catalog, Collibra, or Dataplex
- Build data quality alerting and data quarantine solutions to ensure downstream reliability
- Implement CI/CD pipelines with version control, automated tests, and automated deployments
- Collaborate in Agile teams using Scrum or Kanban methodologies

Technologies: BigQuery, CI/CD, Cloud Composer, ETL, GCP, Kanban, NoSQL, Spark

More: We are an award-winning innovation and transformation consultancy known for our cutting-edge work in data engineering, cloud solutions, and enterprise transformation. With a culture that fosters the growth of technical specialists, we empower our team to turn complexity into opportunity. We are looking for a Principal GCP Data Engineer to join our data and analytics practice, leading the design and delivery of end-to-end data solutions on Google Cloud Platform. This role offers the chance to shape data strategy and drive technical excellence across complex programmes while enjoying a collaborative, inclusive, and learning-focused culture.

Last updated: week 7 of 2026