We are currently recruiting for a Principal Data Engineer with ECS experience to join one of our insurance clients on a 6-month contract.
Inside IR35
Hybrid: 2 days a week onsite in Ipswich
Responsibilities
* Own and evolve the target architecture for enterprise data platforms across Systems of Record and Analytical Platforms, translating it into deliverable engineering work
* Produce solution designs, patterns, and working examples (e.g., sample pipelines, reference notebooks)
* Act as a technical authority on architectural trade-offs, helping teams balance timeliness, consistency, resilience, simplicity, and cost
* Create, document, and help implement engineering standards and best practices (through templates, CI/CD checks, and working examples)
* Lead platform modernisation by evaluating, adopting, and migrating to newer, more cost-effective Microsoft-first capabilities; retire legacy or over-engineered ETL approaches and decommission redundant components
* Promote AI-driven development practices (e.g., AI-assisted code generation, test creation, documentation, and refactoring) with appropriate guardrails around security, correctness, and maintainability
* Enable integration with AI platforms by shaping data products for AI use cases and defining patterns for secure access to enterprise data
System of Record and Analytical Platform Boundaries
* Define consistent patterns for how Systems of Record expose data for analytical consumption
* Ensure Analytical Platforms provide trusted, explainable representations of enterprise data
Experience
* Significant hands-on experience designing, building, and operating enterprise-scale data platforms using Microsoft-first cloud technologies
* Able to deliver independently in a contractor context: shape work from problem statements, manage priorities, and drive to production outcomes with minimal supervision
* Hands‑on experience with relevant data platform services and technologies, such as SQL (e.g., SQL Server/Azure SQL), Microsoft Fabric (Lakehouse/Warehouse, Data Factory), and Databricks (Spark)
* Advanced SQL skills and experience working with complex datasets
* Demonstrated ability to simplify and rationalise existing architectures
* Comfortable working across multiple teams and stakeholder groups with varying priorities
* Experience with modern data engineering practices such as version control, CI/CD, automated testing, and infrastructure‑as‑code
* Experience integrating data platforms with AI services and tools (e.g., Azure OpenAI or equivalent), including secure data access patterns for analytics and AI workloads
* Working knowledge of a general‑purpose programming language used in data engineering (e.g., Python) and scripting/automation
* Strong awareness of security, access control, and governance concepts for enterprise data platforms