 
        
Overview
We are constantly evolving our workflows and are committed to investing in cutting-edge technology. If you are passionate about building and deploying data-centric systems on a major cloud platform and want to make a tangible impact, you will thrive here. You will have the opportunity to contribute ideas and grow with a team that is shaping the future of our data infrastructure.
Responsibilities
 * Build and Maintain Data Pipelines: Build, maintain, and improve data pipelines and ETL/ELT processes.
 * Work with Data Warehousing Solutions: Contribute to data models and optimize queries to ensure data is accessible and performant for analytics teams.
 * Develop and Monitor Data Workflows: Develop, maintain, and monitor data ingestion and delivery pipelines using modern orchestration tools, ensuring data flows seamlessly and reliably.
 * Uphold Data Quality: Apply best practices for data quality, testing, and observability to ensure data delivered to stakeholders is accurate and trustworthy.
 * Collaborate on Data-Driven Solutions: Work with Data Scientists and R&D teams to provide clean and structured data needed to power research.
 * Support System Reliability: Monitor the health and performance of data systems; assist with root cause analysis, deploy fixes, and provide technical support.
 * Contribute to Technical Excellence: Continuously learn about new data technologies, test and implement enhancements to the data platform, and contribute to technical documentation.
 * The Role: Serve as a key member of the data platform team, helping to ensure its reliability, scalability, and efficiency.
Qualifications
 * Experience in Data Pipeline and ETL Development: Solid experience building and maintaining data pipelines, with a good understanding of ETL/ELT patterns.
 * Proficiency in Python and SQL: Hands-on Python for data processing and automation; solid SQL skills for querying and data manipulation.
 * Understanding of Data Modeling and Warehousing: Knowledge of data modeling techniques and data warehousing concepts.
 * Experience with Cloud Platforms: Experience with a major cloud provider (GCP, AWS, or Azure) and its core data services; GCP experience is a plus.
 * Familiarity with Big Data Technologies: Exposure to or experience with large-scale data processing frameworks (e.g., Spark).
 * Workflow Orchestration: Familiarity with data workflow orchestration tools (e.g., Airflow).
 * Infrastructure as Code (IaC): Interest in or exposure to IaC tools (e.g., Terraform).
 * Containerization: Familiarity with Docker and Kubernetes.
 * CI/CD for Data: Basic understanding of applying continuous integration/delivery to data workflows.
 * Data Quality and Testing: Interest in modern data quality and testing frameworks.
 * Version Control: Proficiency with Git.
Benefits
 * Comprehensive benefits package designed to support you as an individual, including 25 days' annual leave, pension contribution, income protection, and life assurance.
 * Additional health & wellbeing, financial benefits, and professional development opportunities.
 * Flexible working arrangements; hybrid model with 3 days per week in office or with clients. Please highlight your preferred arrangement in your application.
Equality and Inclusion
We are committed to equality, promoting a positive and inclusive working environment and ensuring diversity of people and views. We are a member of the Disability Confident scheme, certified as Level 1 Disability Confident Committed. We are dedicated to providing an inclusive and accessible recruitment process.