Location: Glasgow (Hybrid – 2–3 days on-site)
Contract Length: Until 31/12/2026
Day Rate: £447.50 (PAYE via Umbrella)
About the Role
We are seeking a highly skilled AWS Lead Data Engineer to join a long-term data engineering and cloud modernisation programme. The role involves designing, building, and maintaining data systems, including pipelines, data lakes, and data warehouses, and ensuring they are secure, scalable, and high-performing.
Depending on your skill set, you will either focus on developing AWS-based data engineering pipelines or support large-scale cloud migrations into Databricks.
UI development experience and AWS CloudFormation are essential requirements.
Key Responsibilities
* Lead the design and delivery of production-grade data engineering solutions.
* Build, optimise, and maintain large-scale data pipelines.
* Develop and enhance data lakes and data warehouses.
* Implement scalable SQL and Python/PySpark-based solutions.
* Use orchestration tools such as Airflow and AWS Step Functions.
* Analyse current processes and propose end-to-end technical improvements.
* Engage with stakeholders to translate business needs into technical designs.
* Support migrations into Databricks and modern data platforms.
* Work closely with data scientists to build and deploy machine learning models.
Required Technical Skills
* Strong data engineering background with hands-on AWS experience across S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, and KMS
* AWS CloudFormation (mandatory)
* UI development experience (mandatory)
* Strong SQL, Python, and PySpark
* Experience with GitLab and unit testing
* Knowledge of modern data engineering patterns and best practices
Desirable (Databricks Track)
* Apache Spark
* Databricks (Delta Lake, Unity Catalog, MLflow)
* Experience with Databricks migration or development
* AI/ML understanding (nice to have)