A large global organisation is looking to hire a Senior Data Analyst to lead end-to-end data workflows from requirement to delivery, including data product creation and secure data transfer from Google Cloud Platform (GCP) to PostgreSQL.
This is a full-time, permanent position.
The role is offered on a hybrid/remote basis from Milton Keynes, UK.
Key Responsibilities/Skills
* Develop & Schedule SQL Views via DAGs
* Design and implement SQL views aligned with business needs, prioritizing clarity, reusability, and efficiency.
* Build and manage workflow orchestrations (e.g., Airflow DAGs) to automate refreshes of those views, ensuring reliable execution on daily, weekly, or custom schedules; a minimal sketch follows below.
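As a flavour of the work, here is a minimal sketch of an Airflow DAG that recreates a reporting view on a daily schedule. The connection id (`analytics_pg`), schema, and view logic are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4+ (older deployments use `schedule_interval`).

```python
# Minimal sketch: schedule a SQL view refresh via an Airflow DAG.
# Assumes the Postgres provider package is installed and a connection
# named "analytics_pg" exists (both hypothetical).
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="refresh_reporting_views",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # daily, weekly ("@weekly"), or any cron expression
    catchup=False,
) as dag:
    # (Re)create the view so downstream consumers always see current logic.
    refresh_view = PostgresOperator(
        task_id="refresh_daily_sales_view",
        postgres_conn_id="analytics_pg",  # hypothetical connection id
        sql="""
            CREATE OR REPLACE VIEW reporting.daily_sales AS
            SELECT order_date, SUM(amount) AS total_amount
            FROM sales.orders
            GROUP BY order_date;
        """,
    )
```

Keeping view definitions in version-controlled DAG code, rather than applying them ad hoc, is what makes the clarity and reusability goals above enforceable.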
* Execute Cross‑Platform ETL with AWS Glue
* Develop, deploy, and maintain AWS Glue jobs to extract data from GCP (such as BigQuery or GCS) and load it into PostgreSQL.
* Set up secure connectivity, schedule jobs via cron expressions or event-based triggers, and ensure data pipelines are reliable and idempotent; the shape of such a job is sketched below.
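The skeleton of a cross-platform Glue job might look like the following, assuming a PySpark Glue script, a pre-configured BigQuery connector, and a catalogued JDBC connection to PostgreSQL. Every connection name, project id, and table name here is a hypothetical stand-in.

```python
# Minimal sketch of an AWS Glue PySpark job: extract from BigQuery,
# load into PostgreSQL. All names below are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read the source table through a BigQuery connector configured
# in Glue (the exact connection_type depends on the connector in use).
source = glue_context.create_dynamic_frame.from_options(
    connection_type="marketplace.spark",
    connection_options={
        "connectionName": "bigquery-connection",  # hypothetical
        "parentProject": "my-gcp-project",        # hypothetical
        "table": "analytics.orders",              # hypothetical
    },
)

# Load: write into PostgreSQL via a Glue-catalogued JDBC connection.
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=source,
    catalog_connection="postgres-connection",     # hypothetical
    connection_options={"dbtable": "public.orders", "database": "analytics"},
)

job.commit()
```

Idempotency is usually achieved around this skeleton, for example by loading into a staging table and swapping or upserting, so a retried run cannot double-insert rows.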
* Monitor, Troubleshoot & Resolve Incidents
* Continuously oversee ETL workflows in Airflow and AWS Glue, proactively responding to alerts and errors.
* Conduct root cause analysis for pipeline failures, whether due to schema mismatches or performance bottlenecks, and apply robust fixes. Document resolutions to strengthen system resilience.
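In Airflow, proactive alerting of this kind is typically wired up with failure callbacks. The sketch below only logs the failing task and its log URL; the hand-off to a paging or chat tool is left as a stub, since the team's actual alerting channel isn't specified here.

```python
# Minimal sketch: an Airflow on_failure_callback that surfaces failures.
import logging

def alert_on_failure(context):
    """Log which task failed, on which attempt, and where its logs live."""
    ti = context["task_instance"]
    logging.error(
        "Task %s in DAG %s failed (try %s). Logs: %s",
        ti.task_id, ti.dag_id, ti.try_number, ti.log_url,
    )
    # Forward to the team's paging/chat tool here (hypothetical stub).

# Attach via default_args so every task in a DAG reports failures,
# with a couple of automatic retries before the callback fires for good.
default_args = {"on_failure_callback": alert_on_failure, "retries": 2}
```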
* Design, Build, & Govern Data Products
* Architect, construct, and maintain reusable data products, embedding clean datasets, metadata, governance policies, and clearly defined data contracts.
* Ensure compliance with FAIR principles (data that is Findable, Accessible, Interoperable, and Reusable) and enforce robust access controls in collaboration with governance stakeholders.
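Data contracts become enforceable when they are checked in code before a product is published. The sketch below is a deliberately small, hypothetical example using pandas; the contract fields and the "orders" dataset are illustrative assumptions, not part of the stated stack.

```python
# Minimal sketch: validate a dataset against a simple column-level
# data contract before publishing it as a data product.
from dataclasses import dataclass

import pandas as pd

@dataclass(frozen=True)
class ColumnContract:
    name: str
    dtype: str
    nullable: bool = False

# Hypothetical contract for an "orders" data product.
ORDERS_CONTRACT = [
    ColumnContract("order_id", "int64"),
    ColumnContract("order_date", "datetime64[ns]"),
    ColumnContract("amount", "float64", nullable=True),
]

def validate(df: pd.DataFrame, contract: list[ColumnContract]) -> None:
    """Raise with every violation found, so publishing fails fast."""
    errors = []
    for col in contract:
        if col.name not in df.columns:
            errors.append(f"missing column: {col.name}")
        elif str(df[col.name].dtype) != col.dtype:
            errors.append(f"{col.name}: expected {col.dtype}, got {df[col.name].dtype}")
        elif not col.nullable and df[col.name].isna().any():
            errors.append(f"{col.name}: nulls in non-nullable column")
    if errors:
        raise ValueError("contract violations: " + "; ".join(errors))
```

The same contract objects can double as machine-readable metadata, which is what makes a dataset Findable and Interoperable rather than merely documented.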
* Translate Requirements into Technical Designs
* Gather and analyze requirements via stakeholder engagement, user stories, or use cases.
* Convert these into detailed design artifacts, including architecture diagrams, data models, and specifications for development.
* Optimize Performance Across the Stack
* Continuously refine ETL pipelines, SQL logic, and data workflows to boost efficiency and scalability. Techniques may include indexing, partitioning, caching, or materialized views to improve query speed; two of these are sketched below.
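For instance, indexing and materialized views might be applied to PostgreSQL as follows. The table, schema, and connection string are hypothetical, and `psycopg2` is simply one common driver choice rather than a mandated tool.

```python
# Minimal sketch: two common PostgreSQL speed-ups, applied via psycopg2.
import psycopg2

STATEMENTS = [
    # Index the usual filter column so lookups avoid full table scans.
    "CREATE INDEX IF NOT EXISTS idx_orders_order_date"
    " ON sales.orders (order_date);",
    # Precompute the heavy aggregate once, then serve reads from it.
    """
    CREATE MATERIALIZED VIEW IF NOT EXISTS reporting.daily_sales_mv AS
    SELECT order_date, SUM(amount) AS total_amount
    FROM sales.orders
    GROUP BY order_date;
    """,
    # Refresh on the pipeline's schedule (e.g., from the Airflow DAG above).
    "REFRESH MATERIALIZED VIEW reporting.daily_sales_mv;",
]

# Hypothetical DSN; the context manager commits the transaction on success.
with psycopg2.connect("postgresql://user:pass@host/analytics") as conn:
    with conn.cursor() as cur:
        for statement in STATEMENTS:
            cur.execute(statement)
```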
* Lead Migration from hh360 to BigQuery
* Architect and drive a seamless migration strategy to move data and pipelines from the legacy hh360 system into Google BigQuery.
* Employ iterative migration patterns for safe data transfers, rigorous validation, and phased deprecation of legacy infrastructure; one such validation step is sketched below.
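A concrete validation step in an iterative migration is reconciling row counts between the legacy system and BigQuery before deprecating anything. The sketch below assumes the `google-cloud-bigquery` client library; because hh360's interface isn't described in this posting, legacy counts are taken as a pre-computed input rather than queried directly.

```python
# Minimal sketch: reconcile per-table row counts against BigQuery
# before signing off a migration batch. Table names are hypothetical.
from google.cloud import bigquery

def bq_row_count(client: bigquery.Client, table: str) -> int:
    """Count rows in a fully qualified BigQuery table."""
    rows = client.query(f"SELECT COUNT(*) AS n FROM `{table}`").result()
    return next(iter(rows)).n

def validate_migration(
    legacy_counts: dict[str, int],    # e.g. exported from hh360 beforehand
    migrated_tables: dict[str, str],  # logical name -> BigQuery table id
) -> list[str]:
    """Return a human-readable line for every mismatched table."""
    client = bigquery.Client()
    mismatches = []
    for name, bq_table in migrated_tables.items():
        migrated = bq_row_count(client, bq_table)
        if migrated != legacy_counts[name]:
            mismatches.append(
                f"{name}: legacy={legacy_counts[name]} bigquery={migrated}"
            )
    return mismatches
```

Row counts are only a first gate; checksum or sampled value comparisons typically follow before the legacy pipeline is switched off.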