Role / Job Title: Data Ops Engineer
Work Location: Norwich, 3 days per week (flexible)
Duration of Assignment: 6 months
The Role
A Data Ops Engineer is responsible for designing, automating, and optimising data pipelines and ensuring smooth data flow across the organisation. The role bridges data engineering, operations, and DevOps practices to deliver reliable, high-quality, and timely data for analytics, reporting, and business applications.
Your Responsibilities
1. Data Pipeline Development & Maintenance
* Design, build, and maintain automated, scalable data pipelines (batch and real-time)
* Optimize ETL/ELT jobs for performance, reliability, and cost efficiency
* Ensure data pipelines meet SLAs, quality standards, and security guidelines
2. Data Platform Operations
* Manage and monitor data platform operations using DataOps/DevOps practices
* Ensure high availability and reliability of data platforms (cloud or on-prem)
* Troubleshoot pipeline failures and perform root cause analysis (RCA)
3. Automation & CI/CD
* Implement CI/CD pipelines for data workflows
* Automate testing, deployment, and monitoring for data services
4. Data Quality & Governance
* Implement data validation, profiling, and quality checks
* Work with data governance teams to enforce metadata standards, lineage, and catalogs
5. Collaboration
* Collaborate with data engineers, BI teams, analysts, and product teams
* Translate business requirements into scalable data solutions
Your Profile
Essential Skills / Knowledge / Experience
Technical Skills
* Strong experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Airflow, ADF)
* Proficiency in SQL, data modeling, and performance tuning
* Hands-on experience with cloud platforms (Azure / AWS / GCP) and their data services
* Understanding of DevOps tools: Git, Jenkins, Docker, Kubernetes (optional)
* Familiarity with messaging and streaming technologies: Kafka, Event Hub, Kinesis
* Experience with monitoring tools (e.g., Grafana, Prometheus, CloudWatch)
Soft Skills
* Strong analytical and problem-solving abilities
* Excellent communication and stakeholder management
* Ability to work in cross-functional, agile environments