Principal Data Engineer
We are currently recruiting for a Principal Data Engineer with ECS experience to join one of our Insurance clients on a 6-month contract.
Inside IR35
Hybrid: 2 days a week onsite in Ipswich
Responsibilities
1. Own and evolve the target architecture for enterprise data platforms across Systems of Record and Analytical Platforms, translating it into deliverable engineering work
2. Produce solution designs, patterns, and working examples (e.g., sample pipelines, reference notebooks)
3. Act as a technical authority on architectural trade-offs, helping teams balance timeliness, consistency, resilience, simplicity, and cost
4. Create, document, and help implement engineering standards and best practices (through templates, CI/CD checks, and working examples)
5. Lead platform modernisation by evaluating, adopting, and migrating to newer, more cost-effective Microsoft-first capabilities; retire legacy or over-engineered ETL approaches and decommission redundant components
6. Promote AI-driven development practices (e.g., AI-assisted code generation, test creation, documentation, and refactoring) with appropriate guardrails around security, correctness, and maintainability
7. Enable integration with AI platforms by shaping data products for AI use cases and defining patterns for secure access to enterprise data
System of Record and Analytical Platform Boundaries
1. Define consistent patterns for how Systems of Record expose data for analytical consumption
2. Ensure Analytical Platforms provide trusted, explainable representations of enterprise data
Experience
1. Significant hands-on experience designing, building, and operating enterprise-scale data platforms using Microsoft-first cloud technologies
2. Able to deliver independently in a contractor context: shape work from problem statements, manage priorities, and drive to production outcomes with minimal supervision
3. Hands-on experience with relevant data platform services and technologies, such as SQL (e.g., SQL Server/Azure SQL), Microsoft Fabric (Lakehouse/Warehouse, Data Factory), and Databricks (Spark)
4. Advanced SQL skills and experience working with complex datasets
5. Demonstrated ability to simplify and rationalise existing architectures
6. Comfortable working across multiple teams and stakeholder groups with varying priorities
7. Experience with modern data engineering practices such as version control, CI/CD, automated testing, and infrastructure-as-code
8. Experience integrating data platforms with AI services and tools (e.g., Azure OpenAI or equivalent), including secure data access patterns for analytics and AI workloads
9. Working knowledge of a general-purpose programming language used in data engineering (e.g., Python) and scripting/automation
10. Strong awareness of security, access control, and governance concepts for enterprise data platforms
Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.