Data Engineer - Databricks / AI

If you enjoy designing elegant data systems, shipping production-grade code, and seeing your work make a measurable difference, then this role is for you. Our team builds data platforms and AI solutions that power critical infrastructure, transform operations, and move entire industries. In this role you will work with a broad set of clients on high-scale problems, with the backing of a global organisation investing heavily in Azure, Databricks, and applied AI.

* Work primarily on Azure Databricks (Spark, Delta Lake, Unity Catalog)
* Ship modern ELT/ETL, streaming and batch data products, and ML/AI pipelines
* Operate at serious scale across water, transport, energy, and more
* Join a collaborative, engineering-led culture with real investment in platforms and tooling

You'll be utilising a stack such as:

* Azure: ADLS Gen2, Event Hubs, Azure Data Factory (ADF) or Synapse pipelines, Functions, Key Vault, VNets
* Databricks: Spark, Delta Lake, Unity Catalog, Workflows, MLflow (experiments, model registry)
* Languages: Python (PySpark), SQL (Delta SQL), optional Scala
* Engineering: Git, pull requests, code review, unit/integration tests, dbx, notebooks as code
* Platform & Ops: Azure DevOps/GitHub, CI/CD, Terraform or Bicep, monitoring/alerting

Your remit and responsibilities will include:

* Design & build robust data platforms and pipelines on Azure and Databricks (batch and streaming) using Python/SQL, Spark, Delta Lake, and data lakehouse patterns.
* Develop AI-enabling foundations: feature stores, ML-ready datasets, and automated model-serving pathways (MLflow, model registries, CI/CD).
* Own quality & reliability: testing (dbx/pytest), observability (metrics, logging, lineage), and cost/performance optimisation.
* Harden for enterprise: security-by-design, access patterns with Unity Catalog, data governance, and reproducible environments.
* Automate the boring stuff: IaC (Terraform/Bicep), CI/CD (Azure DevOps/GitHub Actions), and templated project scaffolding.
* Partner with clients: translate business problems into technical plans, run workshops, and present trade-offs with clarity.
* Ship value continuously: iterate, review, and release frequently; measure outcomes, not just outputs.

Our team would be delighted to hear from candidates with a good mix of:

* SQL and Python for building reliable data pipelines.
* Hands-on Spark experience (preferably Databricks) and modern data modelling (e.g. Kimball/Inmon/Data Vault, lakehouse).
* Experience running on a cloud data platform (ideally Azure).
* Sound software delivery practices: Git, CI/CD, testing, Agile ways of working.
* Streaming/event-driven designs (Event Hubs, Kafka, Structured Streaming).
* MPP/data warehouses (Synapse, Snowflake, Redshift) and NoSQL (Cosmos DB).
* ML enablement: feature engineering at scale, MLflow, basic model lifecycle know-how.
* Infrastructure-as-code (Terraform/Bicep) and platform hardening.

Don't meet every single bullet? We'd still love to hear from you. We hire for mindset and potential as much as current skills.

https://careers.jacobs.com/en_US/careers/JobDetail/32020

Locations: London, Greater London, United Kingdom | Glasgow, Lanarkshire, United Kingdom | Manchester, Greater Manchester, United Kingdom
Digital and Data Enterprise Functions

Joining Jacobs not only connects you locally but globally. Our values stand on a foundation of safety, integrity, inclusion and belonging. We put people at the heart of our business, and we truly believe that by supporting one another through our culture of caring, we all succeed. We value positive mental health and a sense of belonging for all employees. With safety and flexibility always top of mind, we’ve gone beyond traditional ways of working so you have the support, means and space to maximize your potential.
You’ll uncover flexible working arrangements, benefits, and opportunities, from well-being benefits to our global giving and volunteering program, to exploring new and inventive ways to help our clients make the world a better place. No matter what drives you, you’ll discover how you can cultivate, nurture, and achieve your goals – all at a single global company. Find out more about life at Jacobs.

We aim to embed inclusion and belonging in everything we do. We know that if we are inclusive, we’re more connected and creative. We accept people for who they are, and we champion the richness of different perspectives, lived experiences and backgrounds in the workplace as a source of learning and innovation. We are committed to building vibrant communities within Jacobs, including through our Jacobs Employee Networks, Communities of Practice and our Find Your Community initiatives, allowing every employee to find connection, purpose, and belonging. Find out more about our Jacobs Employee Networks here.

Jacobs partners with VERCIDA to help us attract and retain talent from a wide range of backgrounds. For greater online accessibility, please visit www.vercida.com/uk/employers/jacobs to view and access our roles. As a disability confident employer, we will interview all disabled applicants who meet the minimum criteria for a vacancy. We welcome applications from candidates who are seeking flexible working and from those who may not meet all the listed requirements for a role.

We value collaboration and believe that in-person interactions are crucial for both our culture and client delivery. We empower employees with our hybrid working policy, allowing them to split their work week between Jacobs offices/projects and remote locations, enabling them to deliver their best work. Your application experience is important to us, and we’re keen to adapt to make every interaction even better.
If you require further support or reasonable adjustments with regards to the recruitment process (for example, if you require the application form in a different format), please contact the team via Careers Support.