Overview
Edinburgh on-site, 3 days per week. 6 months (likely extension). £550–£615 per day, outside IR35.

Primus is partnering with a leading Financial Services client embarking on a greenfield data transformation programme. The current processes offer limited digital customer interaction; the vision is to modernise them by building a data platform in Databricks, creating a single customer view across the organisation, and enabling new client-facing digital services through real-time and batch data pipelines.
Responsibilities
* Design and build scalable data pipelines and transformation logic in Databricks
* Implement and maintain Delta Lake physical models and relational data models
* Contribute to design and coding standards, working closely with architects
* Develop and maintain Python packages and libraries to support engineering work
* Build and run automated testing frameworks (e.g. PyTest)
* Support CI/CD pipelines and DevOps best practices
* Collaborate with BAs on source-to-target mapping and build new data model components
* Participate in Agile ceremonies (stand-ups, backlog refinement, etc.)
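The responsibilities above combine transformation logic (building a single customer view) with automated PyTest-based testing. A minimal illustrative sketch of that pairing is below; the function name, record shape, and merge rule are hypothetical, and in the actual role this logic would be written as PySpark in Databricks rather than plain Python:

```python
# Hypothetical sketch: collapsing duplicate customer rows into a single
# customer view, keeping the most recent non-null value per field.
# In Databricks this would be a PySpark/Delta Lake transformation.

def merge_customer_records(records):
    """Return one merged dict per customer_id from a list of row dicts."""
    merged = {}
    # Process oldest-to-newest so later non-null values overwrite earlier ones
    for rec in sorted(records, key=lambda r: r["updated_at"]):
        current = merged.setdefault(rec["customer_id"], {})
        for key, value in rec.items():
            if value is not None:
                current[key] = value
    return merged


# PyTest-style check of the transformation logic
def test_latest_non_null_wins():
    rows = [
        {"customer_id": 1, "email": "old@example.com", "phone": None, "updated_at": 1},
        {"customer_id": 1, "email": None, "phone": "0131 000 0000", "updated_at": 2},
    ]
    view = merge_customer_records(rows)
    assert view[1]["email"] == "old@example.com"  # null later value does not erase it
    assert view[1]["phone"] == "0131 000 0000"    # newer non-null value wins
```

Keeping the merge rule in a pure function like this makes it trivially unit-testable in CI before it is wired into an orchestrated pipeline.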
Essential Skills
* PySpark and SparkSQL
* Strong knowledge of relational database modelling
* Experience designing and implementing in Databricks (notebooks, Delta Lake)
* Azure platform experience
* ADF or Synapse pipelines for orchestration
* Python development
* Familiarity with CI/CD and DevOps principles
Desirable Skills
* Data Vault 2.0
* Data Governance & Quality tools (e.g. Great Expectations, Collibra)
* Terraform and Infrastructure as Code
* Event Hubs, Azure Functions
* Experience with DLT (Delta Live Tables) / Lakeflow Declarative Pipelines
* Financial Services background
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology
Industries
Technology, Information and Internet