Job Description
These roles will eventually require DV Clearance, so I’m afraid we can only accept applications from UK Nationals, ideally those who already hold SC Clearance.
We will consider candidates of ALL levels here, as there are potentially many opportunities.
The Client:
Our client is a leading Systems & Engineering Consultancy with some large-scale Government & Defence contracts. They’re known for looking after their staff and providing an excellent working environment, with sound prospects, training and development, and hybrid working. For this role you’ll need to be flexible enough to be in Cambridge a few days a week, and potentially at other client sites, with the rest of the time working from home. The benefits are fantastic and available on request.
The Role:
In this role, you’ll design and build data pipelines that enable faster decision-making, smarter operations, and better outcomes, often in real time and at scale. Working alongside analysts, engineers, and mission specialists, you’ll help connect the dots between data and national resilience.
As a Data Engineer you will add value to the business by helping to develop capabilities, securing new business opportunities, and contributing to our trusting and flexible culture. You will:
* Develop and maintain robust, secure data pipelines across varied defence systems
* Integrate, transform, and enrich large-scale datasets, including structured, unstructured, and sensor data
* Collaborate with platform and cloud teams to ensure data solutions are scalable, secure, and dependable
* Support users across intelligence, logistics, and operational teams to unlock the full value of data
* Occasionally handle geospatial datasets, such as satellite imagery, mapping feeds, or location-tagged data
* Contribute to the design of modern data architectures in line with MOD standards
Requirements:
* Demonstrable experience in data engineering or ETL development using tools like Spark, Python, or SQL
* Familiarity with cloud platforms (AWS, Azure, or MODCloud) and data services
* Experience building and automating data pipelines using Airflow, dbt, or similar tools
* An understanding of data security and privacy in sensitive environments
* Ideally exposure to geospatial tools or libraries such as PostGIS, GDAL, QGIS, or geospatial APIs
* Experience with defence-related datasets, sensor feeds, or mission systems
* Familiarity with containerised environments (Docker/Kubernetes) and DevOps practices
* Knowledge of MOD data standards, JSPs, or NIST frameworks is desirable
Process:
The interview process will consist of two or three stages, with Technical and Competency-based interviews conducted mainly over Teams. We’re looking for someone to join as soon as possible, though we’re happy to wait for the right person.
Keywords:
Data Engineer, ETL, AWS, Azure, GCP, Cloud, Python, SQL, Spark, Engineer