Salary: £60,000 - £100,000 per year

Requirements:
- Experience as a Data Engineer / Big Data Engineer
- Strong coding skills in Python, Scala, or Java
- Hands-on experience with Spark, Kafka, and ETL / data pipelines
- Knowledge of cloud platforms (AWS, Azure, or GCP)
- Familiarity with Agile and software engineering best practices
- Strong communication and stakeholder engagement skills

Responsibilities:
- Build and optimise data pipelines, ETL processes, and data platforms
- Develop solutions using Python, Spark, Kafka, Hadoop, or similar technologies
- Work in Agile teams to deliver production-ready systems
- Translate business requirements into technical data solutions
- Engage with stakeholders and communicate technical concepts clearly
- Deploy solutions using cloud services (AWS/Azure/GCP), Docker, Kubernetes, and CI/CD

Technologies: AWS, Azure, Big Data, Cloud, Docker, ETL, GCP, Hadoop, Java, Kafka, Kubernetes, Python, Scala, Spark, CI/CD, DevOps

More: We are hiring a Data Engineer / Consultant Data Engineer to design and deliver scalable data pipelines and big data solutions for clients. This is a client-facing role combining hands-on engineering with stakeholder engagement and solution design. We offer a collaborative team environment that values efficiency and innovation, and we are located in a vibrant area that encourages professional development.

Last updated: week 13 of 2026