* Develop and maintain data pipelines for ingesting, transforming, and modelling structured and unstructured data.
* Work across cloud platforms (AWS, Azure, OCI, GCP) and Palantir Foundry environments to deliver and maintain robust data solutions.
* Support data modelling and warehousing activities using best practices in schema design and data governance.
* Contribute to ETL/ELT development, testing, and automation using tools such as Python, SQL, Airflow, or Azure Data Factory.
* Collaborate closely with analysts, scientists, and architects to ensure data solutions meet business needs.
* Participate in Agile delivery teams, including sprint planning, code reviews, and show-and-tells.
* Learn and apply DevOps and CI/CD principles to improve data deployment and reliability.
* Document work clearly and share knowledge with your team.
What We're Looking For
Essential Skills & Experience
* Interest and demonstrable experience in data engineering, data analytics, or a related technical capability, gained through previous employment, work experience, self-study, or personal projects.
* Understanding of data fundamentals, including data pipelines, data modelling, and data quality principles.
* Awareness of, and experience working with, different types of data, such as structured and unstructured data.
* Experience with SQL and familiarity with Python for data transformation and automation.
* Hands-on experience (or strong interest) in one or more cloud platforms: AWS, Azure, or GCP.
* Interest in working with Palantir Foundry or similar enterprise data platforms.
* Good communication skills and the ability to work as part of a multi-disciplinary Agile team, including as an embedded team member in a customer environment.
* Eligible for UK Security Clearance (or 5+ years UK residency).