The Role
We are looking for a hands-on Azure Data Engineer who will lead the final phase of our Client's cloud migration and design the enterprise-grade data platform from the ground up. This is a hybrid role with a strong technical focus, blending architecture, automation, and data engineering, to empower the company's next generation of AI and BI capabilities.
About The Company
The Company is a dynamic, global procurement consultancy operating across Europe, the US, and APAC. As they scale globally and accelerate their AI capabilities, they are completing their transition to the cloud and building a company-wide data platform to power insight-driven transformation for their consultants and clients.
Required Skills & Experience
Must-Haves:
3+ years of hands-on Azure engineering experience (IaaS to PaaS), including Infrastructure as Code.
Strong SQL skills and proficiency in Python or PySpark.
Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift.
Experience hardening cloud environments (NSGs, identity, Defender).
Demonstrated automation of backups, CI/CD deployments, or DR workflows.
Nice-to-Haves:
Experience with Azure OpenAI, vector databases, or LLM integrations.
Power BI data modeling, DAX, and RLS.
Certifications: AZ-104, AZ-305, DP-203, or AI-102.
Knowledge of ISO 27001, Cyber Essentials+, or SOC 2 frameworks.
Exposure to consulting or professional services environments.
Familiarity with the Power Platform.
Awareness of data privacy regulations (e.g., GDPR, CCPA).
Soft Skills
Consultative mindset: can turn business questions into technical outcomes.
Comfortable switching hats: architect, hands-on builder, and mentor.
Clear communicator, able to work effectively across time zones and teams.
Thrives in a small, high-trust, high-autonomy team culture.
Day-to-Day Responsibilities
Infrastructure & Automation:
Deploy and manage infrastructure using Bicep/Terraform, GitHub Actions, and PowerShell/DSC.
Data Engineering:
Architect and implement scalable ETL/ELT solutions; model schemas, optimize performance, and apply lakehouse best practices.
Security & Resilience:
Implement best-practice cloud security (NSGs, Defender, Conditional Access), automate DR/backups, and run quarterly restore drills.
Collaboration:
Partner with AI Product Owners, Business Performance, and Data Analysts to translate business needs into robust data solutions.
Mentorship & Knowledge Sharing:
Act as a data SME, guiding system administrators and upskilling junior technical team members.
What You'll Achieve in Year 1
Months 3-12:
Design and build their Azure data lake using Synapse, Fabric, or an alternative strategy.
Ingest data from core platforms: NetSuite, HubSpot, and client RFP datasets.
Automate data pipelines using ADF, Fabric Dataflows, PySpark, or SQL.
Publish governed datasets with Power BI, enabling row-level security (RLS).
By Year-End:
Deliver a production-ready lakehouse powering BI and ready for AI/Gen-AI initiatives.
Position the business to rapidly scale data products across regions and services.
What's in It for You
Greenfield opportunity: Shape and deliver the first enterprise data platform.
Career growth: Scale with the company into Lead Data, Cloud, or Solution Architect roles.
Hybrid flexibility: Remote-first with 2-3 days/week onsite in the Cardiff office.
Development: Funded certifications, dedicated R&D time, access to Company networks and resources.