Role Title: Senior Databricks Engineer
Location: Glasgow
Duration: Until 31/12/2026
Days on site: 2-3
MUST BE PAYE THROUGH AN UMBRELLA COMPANY
Role Description:
We are migrating our data pipelines from AWS to Databricks and are seeking a Senior
Databricks Engineer to lead and contribute to this transformation. This is a hands-on
engineering role focused on designing, building, and optimizing scalable data solutions
on the Databricks platform.
Key Responsibilities:
• Lead the migration of existing AWS-based data pipelines to Databricks.
• Design and implement scalable data engineering solutions using Apache Spark on
Databricks.
• Collaborate with cross-functional teams to understand data requirements and translate
them into efficient pipelines.
• Optimize performance and cost-efficiency of Databricks workloads.
• Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
• Ensure data quality and reliability through robust unit testing and validation
frameworks.
• Implement best practices for data governance, security, and access control within
Databricks.
• Provide technical mentorship and guidance to junior engineers.
Must-Have Skills:
• Strong hands-on experience with Databricks and Apache Spark (preferably PySpark).
• Proven track record of building and optimizing data pipelines in cloud environments.
• Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM,
and VPC.
• Proficiency in Python for data engineering tasks.
• Familiarity with GitLab for version control and CI/CD.
• Strong understanding of unit testing and data validation techniques.
Preferred Qualifications:
• Experience with Databricks Delta Lake, Unity Catalog, and MLflow.
• Knowledge of CloudFormation or other infrastructure-as-code tools.
• AWS or Databricks certifications.
• Experience in large-scale data migration projects.
• Background in the finance industry.