Overview
Barclays Glasgow, Scotland, United Kingdom
Join Barclays as a Senior Data Engineer, where you’ll be responsible for designing, building, and optimizing data pipelines and frameworks that power the Enterprise Data Platform across AWS, Azure, and on-premises environments. This role requires strong hands-on engineering skills in data ingestion, transformation, orchestration, and governance, ensuring high-quality, secure, and scalable data solutions.
To Be Successful, You Should Have
* Data Pipeline Development & Orchestration – Expertise in building robust ETL/ELT pipelines using tools such as Apache Airflow (Astronomer), dbt/PySpark, Python, AWS Glue, Lambda, Athena, Snowflake, and Databricks.
* Data Transformation & Quality – Strong experience with dbt Core for transformations and testing, and in implementing data quality frameworks (e.g., dbt-expectations).
* Cloud & Hybrid Data Engineering – Hands-on experience with cloud-native services (AWS, Azure) and on-premises systems, including storage, compute, and data warehousing (e.g., Snowflake, Redshift).
Other Highly Valued Skills Include
* Metadata & Governance Tools – Familiarity with OpenMetadata, Alation, or similar tools for cataloging, lineage, and governance.
* DevOps & CI/CD for Data – Experience using GitHub Actions or similar tools for version control, CI/CD, and infrastructure-as-code for data pipelines.
* Observability & Cost Optimization – Knowledge of monitoring frameworks and FinOps practices for efficient resource utilization.
You may be assessed on the key critical skills relevant for success in this role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills.
This role is based in Glasgow.
Purpose of the role
To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Accountabilities
* Build and maintenance of data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
* Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
* Development of processing and analysis algorithms fit for the intended data complexity and volumes.
* Collaboration with data scientists to build and deploy machine learning models.
Seniority level
* Mid-Senior level
Employment type
* Full-time
Job function
* Information Technology
Industries
* Banking and Financial Services