Senior DataOps Engineer
Based: Leeds - hybrid
Salary: up to £62,000
I'm partnered with an established financial services company who are scaling their cloud-native data platform and building a modern, centralised data function to better support a wide community of Data Scientists, Analysts, and federated Data Engineers across the organisation.
They are looking for a Senior DataOps Engineer to help shape how data pipelines are run, monitored, governed, and optimised at scale.
This is an opportunity to join a growing team working at the heart of the organisation’s data transformation, improving platform efficiency, enabling self-service, and ensuring data pipelines operate with the same discipline as production software.
The Role
As a Senior DataOps Engineer, you’ll take a strategic, high-level view of the data platform while still diving deep when needed. You will focus on observability, automation, pipeline performance, operational excellence, and cloud cost optimisation.
You’ll work cross-functionally with Data Engineering, DevOps, and FinOps teams, helping ensure that data services are reliable, scalable, secure, and cost-effective, and that federated teams across the organisation can self-serve with confidence.
What You’ll Be Doing
* Taking a platform-wide view of how pipelines run, improving performance and throughput
* Enhancing observability and monitoring across Azure-based data workloads
* Identifying bottlenecks and opportunities to streamline operational processes
* Using scheduling/orchestration tools to optimise workflows and improve run times
* Treating data pipelines as production-grade software, with robust monitoring, automation, and scalability built in
* Supporting incident management and helping federated teams resolve issues efficiently
* Driving efficiency through automation and reduction of manual operational overhead
* Working with FinOps practices to optimise spend and evaluate cost-performance trade-offs
* Advocating for better platform usage, adoption, and operational best practices
Tech Stack & Skills
Essential
* Strong experience working with the Azure cloud platform
* Background in data engineering and building/maintaining data pipelines
* Experience with pipeline monitoring, observability, and incident troubleshooting
* A strong automation mindset and the ability to build resilient, self-healing data workflows
Nice to Have
* Knowledge of FinOps principles and cloud cost optimisation
* Experience with orchestration tools such as Azure Data Factory, Databricks Workflows, or Airflow
* Exposure to containerisation tools (Kubernetes, Docker)
* Experience with data cataloguing tools
* Familiarity with unit testing in data pipelines
* Awareness of MLOps practices