The Role
This role will play a critical part in architecting and modernising cloud-native data platforms on Google Cloud Platform for a major financial services environment.
You will design scalable, secure, resilient, and cost-optimised data solutions that serve high-volume, heavily regulated workloads.
The role provides exposure to large-scale transformation programmes, modern engineering practices, mature data governance frameworks, and access to advanced GCP services, enabling you to influence enterprise-wide data strategies.
Your responsibilities:
• Architect end-to-end data solutions using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, GCS, and Composer.
• Design robust data models (conceptual, logical, physical) for complex risk, operations, analytics, and regulatory data domains.
• Build scalable ingestion and transformation frameworks with an emphasis on quality, lineage, metadata, and auditability.
• Define and enforce cloud architecture best practices, security patterns, IAM policies, and data protection standards.
• Lead cost optimisation, performance tuning, and reliability engineering across cloud data workloads.
• Collaborate with engineering squads, product owners, and stakeholders in an Agile environment to deliver technical roadmaps.
• Provide architectural governance, technical oversight, and reusable frameworks to development teams.
• Drive cloud modernisation and legacy-to-cloud migration initiatives across multiple data estates.
• Conduct POCs and evaluate emerging tools to strengthen the organisation’s cloud data capabilities.
• Ensure full compliance with regulatory, audit, and enterprise data governance requirements.
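The ingestion and transformation responsibilities above can be sketched in a minimal form. This is an illustrative assumption, not an actual platform API: the function name, the quality rules, and the shape of the lineage record are hypothetical, chosen only to show how quality gating and auditability might sit alongside a transformation step.

```python
import hashlib
import json
from datetime import datetime, timezone


def transform_with_audit(records, source_uri):
    """Apply a simple transformation while emitting quality and
    lineage metadata alongside the output (illustrative sketch)."""
    passed, rejected = [], []
    for rec in records:
        # Basic quality gate: required fields must be present and non-empty.
        if rec.get("account_id") and rec.get("amount") is not None:
            passed.append({**rec, "amount": round(float(rec["amount"]), 2)})
        else:
            rejected.append(rec)

    # Lineage/audit record: where the data came from, when it was processed,
    # record counts, and a content hash so downstream consumers can verify
    # exactly what they received.
    lineage = {
        "source": source_uri,
        "processed_at": datetime.now(timezone.utc).isoformat(),
        "input_count": len(records),
        "output_count": len(passed),
        "rejected_count": len(rejected),
        "output_sha256": hashlib.sha256(
            json.dumps(passed, sort_keys=True).encode()
        ).hexdigest(),
    }
    return passed, rejected, lineage
```

In a real GCP deployment the same pattern would typically live inside a Dataflow/Beam transform, with the lineage record written to a metadata store rather than returned to the caller.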
Essential skills/knowledge/experience:
• Identifying the correct architecture patterns for workload deployment and ensuring governance
• Managing data lineage and quality aspects of DPs
• Understanding and defining SRE principles such as SLOs, SLIs, and SLAs for each workload
• Supporting feature teams across the lab in optimising FinOps for live workloads
• Developing and deploying data pipelines using various GCP services
• Implementing TDD for unit testing and BDD for functional testing
• Reviewing and creating FinOps dashboards for the lab
• Creating custom Dynatrace monitoring and alerting
• Integrating Looker and enabling reporting for end-users within general guardrails
• Experience in designing high-volume ingestion, transformation, and analytics pipelines.
• Strong knowledge of data governance, lineage, metadata management, and regulatory controls.
• Demonstrated ability to work with cross-functional Agile teams and influence technical direction.
• Experience executing cloud migration or platform modernisation programmes.
• Excellent architectural documentation, communication, and stakeholder engagement skills.
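The SRE expectations listed above (defining SLOs and SLIs per workload) can be made concrete with a small sketch. The function below is a hypothetical helper, assumed here purely for illustration: it computes an availability SLI from request counts and reports how much of the error budget implied by an SLO target remains.

```python
def error_budget_status(total_requests, failed_requests, slo_target=0.999):
    """Compute the availability SLI and the fraction of the error
    budget remaining for a simple request-based SLO (illustrative sketch)."""
    if total_requests <= 0:
        raise ValueError("total_requests must be positive")

    # SLI: the measured proportion of successful requests.
    sli = (total_requests - failed_requests) / total_requests

    # Error budget: the failures the SLO target permits over this window.
    allowed_failures = total_requests * (1 - slo_target)
    budget_remaining = (
        1 - (failed_requests / allowed_failures) if allowed_failures else 0.0
    )
    return {
        "sli": sli,
        "allowed_failures": allowed_failures,
        "error_budget_remaining": budget_remaining,
    }
```

For example, a workload with a 99.9% SLO that served one million requests and failed 500 of them has consumed half its error budget; an alerting rule (e.g. in Dynatrace) might page when the remaining budget drops below a chosen threshold.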
Desirable skills/knowledge/experience:
• GCP Professional Data Engineer or Cloud Architect certification.
• Experience in BFSI domains such as AML, Fraud, Risk, Finance, or Regulatory Reporting.
• Exposure to real-time streaming, ML Ops, and advanced analytics architectures.
• Experience with platform observability tools and cloud cost management frameworks.