If you’re looking to take an exciting new direction with your HSBC career, an internal move can open the door to many opportunities, allowing you to take on a new challenge and develop your skills. Bring your knowledge of our brand to a new role and grow further.
Technology teams in the UK work closely with our global businesses to help design and build digital services that allow our millions of customers around the world to bank quickly, simply and securely. They also run and manage the IT infrastructure, data centres and core banking systems that power the world’s leading international bank. Our multi-disciplined teams include DevOps Engineers, IT Architects, Front and Back End Developers, Infrastructure specialists, Cyber experts, and Project and Programme managers.
A move across the business allows you to continue to access tailored professional development opportunities and our fantastic benefits packages.
Enterprise Technology is the Technology organisation responsible for the design, build and ongoing maintenance of the systems owned by the Group Functions (Risk, Compliance, Finance, Core Banking, Corporate Functions and Deputy COO). The organisation consists of over 8,000 people working in collaboration across 14 countries to support over 3,000 applications.
The Reporting & Analytics stream sits within Colleague Experience Technology, part of the Enterprise Technology global function. The stream is responsible for HSBC’s HR and people-related insights. We are currently recruiting Data Engineers to be part of the Group HR Data Analytics Platform project, which will serve as the backbone for people-related insights, provide a data science landscape for research and analytics, and act as a strategic data refinery supplying output data.
In this role you will:
* Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy.
* Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration.
* Develop and optimize complex SQL queries and Python-based data transformation logic.
* Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes.
* Automate deployment of data pipelines using CI/CD practices in Azure DevOps.
* Ensure data quality, security, and compliance with best practices.
* Monitor and troubleshoot performance issues in data pipelines.
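To give a flavour of the transformation and data-quality work described above, here is a minimal, illustrative sketch in plain Python. In production this logic would typically run as PySpark on Azure Databricks; plain Python is used here so the example is self-contained, and all field names are hypothetical.

```python
def clean_records(raw_rows):
    """Normalise raw HR-style rows and drop those failing basic quality checks."""
    cleaned = []
    for row in raw_rows:
        emp_id = str(row.get("employee_id", "")).strip()
        country = str(row.get("country", "")).strip().upper()
        # Data-quality gate: reject rows with missing key fields.
        if not emp_id or not country:
            continue
        cleaned.append({"employee_id": emp_id, "country": country})
    return cleaned

rows = [
    {"employee_id": " 1001 ", "country": "gb"},
    {"employee_id": "", "country": "FR"},  # fails the quality check
]
print(clean_records(rows))  # → [{'employee_id': '1001', 'country': 'GB'}]
```

In a PySpark pipeline the same gate would be expressed as DataFrame filters and column expressions, with rejected rows routed to a quarantine table for monitoring.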
To be successful in this role you should meet the following requirements:
* Must have experience with Delta Lake and Lakehouse architecture.
* Proven experience in data engineering, working with Azure Databricks, PySpark, and SQL.
* Hands-on experience with Prophecy for data pipeline development.
* Proficiency in Python for data processing and transformation.
* Experience with Apache Airflow for workflow orchestration.
* Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes.
* Familiarity with GitHub and Azure DevOps for version control and CI/CD automation.
This role is based in Sheffield and offers hybrid working.
Being open to different points of view is important for our business and the communities we serve. At HSBC, we’re dedicated to creating diverse and inclusive workplaces that value everyone, no matter their gender, ethnicity, disability, religion, sexual orientation, or age. We are committed to removing barriers and ensuring careers at HSBC are inclusive and accessible for everyone to be at their best. We take pride in being a Disability Confident Leader and will offer an interview to candidates with disabilities, long-term conditions or neurodivergence who meet the minimum criteria for the role.
If you have a need that requires accommodations or changes during the recruitment process, please get in touch with our Recruitment Helpdesk:
Email: hsbc.recruitment@hsbc.com
Telephone: +44 207 832 8500