We are looking for an experienced Technical Architect to design and evolve large-scale, cloud-based data platforms. The role focuses on AWS data ecosystems with growing adoption of Microsoft Fabric, supporting enterprise clients through platform build, migration, and optimisation.
Responsibilities
* Define architecture for data ingestion, processing, and access layers
* Design scalable, resilient data platforms (hub-and-spoke, lake/lakehouse)
* Lead implementation of batch and real-time pipelines (Kafka, APIs, SFTP)
* Architect solutions using AWS (S3, EMR, Glue) and PySpark/Python
* Support cloud migrations and platform modernisation
* Ensure best practices in CI/CD, Terraform, observability, and governance
* Handle schema evolution, data quality, and performance optimisation
* Collaborate across engineering, DevOps, and business teams
* Provide technical leadership and mentoring
Qualifications
* Active SC Clearance (essential)
* Strong experience with AWS data platforms (S3, EMR, Glue)
* Expertise in data lake / lakehouse architectures
* Experience with Kafka and event-driven architectures
* Strong Python & PySpark skills
* Experience with Terraform and CI/CD pipelines
* Proven track record designing large-scale data solutions
Desirable Qualifications
* Experience with Microsoft Fabric (highly preferred)
* Exposure to Azure / Databricks
* Experience in enterprise-scale environments