Salary: £70,000 per year
Requirements:
* To be successful in this role, it is essential that you have:
* Experience building and automating data pipelines in finance, applying a DataOps or DevOps approach.
* The ability to automate and manage data systems so they run reliably and scale effectively.
* Proficiency with tools like AWS (S3, Glue, Redshift, SageMaker) or other cloud platforms.
* Familiarity with Docker, Terraform, GitHub Actions, and Vault for managing secrets.
* Experience in coding with SQL, Python, Spark, or Scala to work with data.
* A solid background with databases used in Data Warehousing, Data Lake, and Lakehouse setups, covering both structured and unstructured data.
* Experience with testing practices and testing standards.
* Experience in leading and coaching individuals, providing ongoing oversight of the development of data products.
* Desirable experience would include:
* Experience of working in an Agile team, preferably SAFe (Scaled Agile Framework).
* Familiarity with specific tools such as Qlik Replicate, Qlik Compose, Databricks, Informatica, or SAS.
* An understanding of data modeling methodologies like Kimball, Data Vault, or Lakehouse.
* Knowledge of Data Science, AI, and Machine Learning ways of working.
Responsibilities:
* In this role, you will be responsible for:
* Driving automation and CI/CD practices across data pipelines and infrastructure.
* Building robust, scalable, and secure infrastructure that supports our data platform strategy.
* Leading and mentoring data engineers while fostering a high-performance, collaborative environment.
* Playing a central role in designing and developing a new cloud-based Data Ecosystem.
* Working closely with stakeholders to analyze requirements, design test strategies, and ensure data quality and integrity through comprehensive testing.
* Collaborating in an agile environment with product managers, analysts, and developers to deliver data products that create tangible business value.
Technologies:
* AI
* AWS
* CI/CD
* Cloud
* Data Vault
* Databricks
* DevOps
* Docker
* GitHub
* Informatica
* Machine Learning
* Python
* Qlik
* SAS
* SQL
* Scala
* Spark
* Terraform
More:
As a Senior DataOps Engineer at The Coventry, you will immerse yourself in the world of data engineering, inspiring and coaching our existing Data Engineering teams. This is more than just a technical position; you will be part of a mission to build a culture of innovation, ownership, and continuous improvement. If you enjoy solving problems, automating processes, improving systems with Infrastructure as Code (IaC), and collaborating with great people, we would love to hear from you.
At The Coventry, we pride ourselves on being one of the largest building societies in the UK, united by our mutual goal of improving the lives of others. We are recognized as a Great Place to Work, offering a culture of reward and recognition, along with comprehensive wellbeing support. We have a hybrid working arrangement in place, allowing flexible working patterns.
We are committed to diversity and equality, and we aim to celebrate it in all forms. We are proud to be a Disability Confident Committed Employer, offering interviews to every disabled applicant who meets the minimum criteria for our vacancies.
Join us to build more than just a career; together, we can achieve more.
Last updated: week 42 of 2025