At ANS, the Data Engineer plays a vital role in enabling data-driven decision-making by developing and maintaining reliable data pipelines, supporting data integration efforts, and working closely with senior engineers and architects to ensure technical alignment. While not responsible for leading solution design, they contribute hands-on expertise to implementation, data transformation, and performance optimisation, helping convert raw data into trusted, usable assets that support internal teams and customer outcomes.
Responsibilities
* Deliver high-quality data solutions by building and optimising data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud-based data sources.
* Work closely with Data Architects and Senior Data Engineers to implement technical designs and contribute to solution development.
* Collaborate with customer‑side data engineers to ensure smooth integration and alignment with business requirements.
* Focus on task execution and delivery, ensuring timelines and quality standards are met.
* Follow engineering best practices, including CI/CD via Azure DevOps and secure data handling using Key Vault and private endpoints, and maintain code quality.
* Participate in Agile ceremonies such as standups, sprint planning, and user story refinement.
* Document solutions clearly for internal use and knowledge sharing.
* Troubleshoot and resolve technical issues across environments and subscriptions.
* Engage in continuous learning through certifications (e.g. DP‑600, DP‑700, AI‑900, AI‑102) and development days.
* Contribute to the Data Engineer Guild by sharing knowledge, participating in discussions, and helping shape engineering standards and practices.
* Consider information security in all work; employees have a duty and responsibility to adhere to business policies and procedures.
Qualifications
* Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric.
* Knowledge of Python and SQL.
* Experience in using metadata frameworks in data engineering.
* Experience with data engineering best practices, including CI/CD via Azure DevOps or GitHub.
* Baseline understanding of Azure networking and security as they relate to the data platform.
* Experience with data governance and regulation, including GDPR, the principle of least privilege, and data classification.
* Experience with Lakehouse architecture, data warehousing principles, and data modelling.
* Familiarity with Power BI and DAX is a plus.
* Basic knowledge of Azure Foundry.