Key Responsibilities
* Lead the technical delivery of complex data engineering projects, ensuring solutions are scalable, secure, and aligned with our delivery framework and client goals.
* Design and build high-quality data pipelines and integration workflows, setting the technical direction and ensuring engineering best practices are followed throughout the development lifecycle.
* Collaborate with multidisciplinary teams across a wide range of roles to shape solutions that meet both technical and business requirements.
* Mentor and support data engineering teams, fostering a culture of continuous improvement, knowledge sharing, and technical excellence.
* Support testing activities by ensuring pipelines are testable, observable, and reliable; work with QA and analysts to define test strategies, implement automated tests, and validate data quality and integrity.
* Contribute to technical planning, including estimation, risk assessment, and defining delivery approaches for client engagements and new opportunities.
* Engage with clients and stakeholders, translating data requirements into technical solutions and communicating complex ideas clearly and effectively.
* Champion engineering standards, contributing to the development and adoption of data engineering guidelines, design patterns, and delivery methodologies that strengthen our delivery framework.
* Stay current with emerging technologies, evaluating their relevance and potential impact, and promoting innovation within the firm and with clients.
* Contribute to internal capability building, helping shape data engineering practices, tools, and frameworks that enhance delivery quality and efficiency.
Essential Competencies
* Strong communicator, able to clearly articulate technical concepts to both technical and non-technical stakeholders.
* Confident working independently or as part of a collaborative, cross-functional team.
* Skilled at building trust with clients and colleagues, with a consultative and solution-focused approach.
* Demonstrated leadership and mentoring capabilities, supporting the growth and development of engineering teams.
* Organised and adaptable, with excellent time management and the ability to respond to shifting priorities.
* Self-motivated, proactive, and committed to continuous learning and improvement.
* Creative problem-solver with the ability to think critically and deliver innovative, practical solutions.
* Team-oriented, with a positive attitude and a strong sense of ownership and accountability.
Technologies, Methodologies and Frameworks
* Direct delivery experience with cloud-native data services in Microsoft Azure, including Fabric, Dataverse, Synapse, Data Lake, and Purview.
* Deep expertise in data engineering tools and practices, including Python, SQL, and modern ETL/ELT frameworks (e.g., Azure Data Factory, Talend, dbt).
* Experience designing and implementing scalable data pipelines and integration patterns across structured and unstructured data sources (e.g., Azure SQL, MySQL, MongoDB).
* Familiarity with data governance, metadata management, and data quality frameworks.
* Practical experience applying DevOps principles to data engineering, including CI/CD pipelines, infrastructure as code, and monitoring.
* Solid understanding of data security and compliance best practices, including secure data handling, Secure by Design principles, and relevant regulatory requirements.
* Comfortable working in agile, multi-disciplinary teams, contributing across the full delivery lifecycle and supporting continuous improvement.
* Adaptable and quick to learn new tools, frameworks, and technologies to meet the needs of diverse client projects.