Role: Senior Data Engineer (Databricks / AWS / Lakehouse)
Please read the details below carefully to make sure you meet the requirements before applying.
Location: Cambridge, UK (Flexible Hybrid Working)
Salary: £70,000 - £90,000 basic + Comprehensive Benefits Package
Are you a Data Engineer who wants to build systems that truly matter? Are you an expert in Databricks, looking for a challenge beyond just operating an existing platform?
I'm hiring for a pioneering, mission-driven company in the medical technology sector. They are fundamentally changing the future of surgery by developing next-generation robotic systems that make minimally invasive procedures more accessible and effective.
This is not a maintenance role. We are looking for a Databricks expert to be the key technical authority for a business-critical transformation. Your mission will be to architect and build a modern Databricks Lakehouse platform and lead the company on its journey to that architecture, transforming how it leverages data from robotics, manufacturing, and R&D.
What You'll Actually Do:
* Architect and lead the greenfield design and implementation of a scalable, company-wide Databricks Lakehouse platform on AWS.
* Be the hands-on technical expert, building and optimising robust ELT/ETL pipelines using Python, Spark, and Databricks (e.g., Delta Live Tables, Databricks Workflows).
* Work with unique, complex, and high-volume datasets from IoT-enabled robotic systems, manufacturing lines, and core business functions.
* Partner with data scientists and BI teams to establish best-in-class data models, governance, and data quality standards within Delta Lake.
* Evangelise the benefits of the Lakehouse across the organisation, championing best practices and mentoring other engineers to build their Databricks capability.
* Own the data platform's roadmap, ensuring it remains scalable, reliable, and secure as the company grows.
What You'll Need:
* Proven, deep commercial experience with Databricks. You must have hands-on expertise with Delta Lake and the Lakehouse paradigm.
* Strong expertise in the AWS data ecosystem (e.g., S3, AWS Glue, Kinesis, IAM) and a deep understanding of how to build, secure, and optimise a Databricks platform within it.
* Expert-level Python and SQL skills, specifically for data engineering and optimisation.
* A "builder" and "leader" mindset: You have experience taking an organisation on a data transformation journey, not just maintaining an existing system.
* A strong understanding of data modelling principles (e.g., Kimball, Inmon) and how to apply them within a Lakehouse architecture.
* A collaborative and proactive approach, with the ability to communicate complex technical concepts to non-technical stakeholders.
What's In It For You?
This is a truly career-defining project: a greenfield opportunity to own a critical transformation, establish yourself as the go-to expert who built the platform from the ground up, and gain invaluable strategic experience. The work supports a tangible mission, offering a rare chance to apply your data skills to a product that is transforming surgery and improving patient outcomes globally.
You will tackle unique technical challenges, solving complex data problems at scale that you won't find anywhere else by working with real-world IoT and robotic data. You'll do this alongside a world-class team of passionate, intelligent, and humble engineers, data scientists, and robotics experts. The company supports this with a genuine commitment to work-life balance, including flexible and hybrid working, all backed by a comprehensive salary and benefits package designed to attract top talent (details available on application).
Interested?
If you are the Databricks and AWS expert we're looking for, apply now to find out more about this unique opportunity.