Hackajob is collaborating with Accenture to connect them with exceptional professionals for this role.
Role: Data & ML Engineer
Location: Newcastle Upon Tyne
Levels: Senior Analyst, Specialist
Experience Level: 3+ years
Please Note: Due to the nature of the client work you will be undertaking, you will need to be willing to go through a Security Clearance (including BPSS) process as part of this role, which requires 5+ years of UK address history at the point of application.
Hybrid Working: This role will require you to work from our Cobalt Business Park office in Newcastle for a minimum of 3 days per week.
Role Overview
As a Data & ML Engineer, you will design, build, and optimise scalable data pipelines and machine learning solutions. You’ll work with client teams to deliver intelligent data products, leveraging modern cloud and AI technologies.
Key Responsibilities
* Design and implement robust data pipelines and ML workflows using Python, SQL, Spark, and Databricks.
* Develop and deploy machine learning models (including NLP, deep learning, and agentic AI) in production environments.
* Integrate data from diverse sources, including streaming and batch ingestion, using Azure Data Factory, GCP Dataflow, and AWS services.
* Apply data modelling concepts (e.g., medallion architecture) and ensure data quality and governance.
* Collaborate with DevOps teams to automate CI/CD and MLOps processes using Azure DevOps, Kubernetes, and Terraform.
* Visualise and communicate insights using Power BI, Tableau, and PowerApps.
* Mentor junior engineers and contribute to internal knowledge sharing.
* Ensure solutions meet security, compliance, and performance standards.
Qualifications
Core Data & AI Skills
* Python, SQL, Spark, Scala
* Machine Learning, NLP, Deep Learning, Prompt Engineering, Agentic AI
* Data Architecture, Data Modelling, Data Engineering, Data Analysis
DevOps & Engineering
* CI/CD and MLOps (Azure DevOps, GitHub Actions, Jenkins, Kubeflow, etc.)
* Infrastructure as Code (Terraform, Ansible, etc.)
* Containers (Kubernetes, Docker, etc.)
Certifications & Tools
* Data Visualisation and UI (Power BI, PowerApps, Tableau, etc.)
* Data Science Platforms (Databricks, Snowflake, etc.)
* Cloud certifications (Azure, AWS, GCP)
* Cloud Native Data Engineering (Azure Data Factory, AWS Glue, GCP Dataflow, etc.)
Other Requirements
* At least 3 years' experience with large-scale data challenges (big data, distributed systems)
* Hands‑on with Infrastructure‑as‑Code (Terraform, Ansible)
* Agile and Waterfall project experience
* Strong stakeholder management and communication skills
* Security and compliance awareness
Desirable
* Experience in client‑facing roles
* Industry certifications (e.g., AWS Solutions Architect, Azure Data Engineer, GCP Data Engineer)
* Experience mentoring or managing teams
What’s In It For You
* Competitive basic salary
* 25 days' annual leave per year
* Private medical insurance
* 3 extra days leave per year for charitable work of your choice
* Flexibility and mobility are required to deliver this role, with possible onsite client engagement