London, United Kingdom | Posted on 20/04/2026
We are seeking an experienced Platform Engineer with strong expertise in Databricks and PySpark to join our team on a contract basis. This role focuses on platform optimisation, performance, and cost efficiency, ensuring our Databricks environment runs optimally.
Key Responsibilities:
* Manage and optimise the Databricks platform to ensure high performance and cost efficiency
* Analyse DBU consumption and drive improvements in platform utilisation
* Monitor and enhance PySpark pipeline performance
* Make informed decisions on scaling clusters up or down based on workload demands
* Establish and maintain automated release management processes
* Focus on operational excellence across the platform environment
* Gather performance data and provide clear evidence-based insights to stakeholders and Databricks teams
Key Skills & Experience:
* Strong hands‑on experience with Databricks platform engineering
* Solid expertise in PySpark and performance tuning
* Proven experience optimising data pipelines and platform costs
* Experience with cluster management and scaling strategies
* Ability to analyse performance metrics and present findings effectively
* Familiarity with platform operations and reliability best practices
Additional Information:
* Industry experience in Trading & Supply is not required
* This is an optimisation‑focused role, rather than development‑heavy
If you’re passionate about performance optimisation, cost efficiency, and platform engineering within Databricks, we’d love to hear from you.