Fundment is a fast-growing wealth infrastructure company, building on our cutting-edge digital investment system to transform the £3 trillion UK wealth management market. We are passionate about revolutionising the investment experience for financial advisers and their clients by combining our innovative proprietary technology with exceptional customer service. As we scale, we are growing our multi-disciplinary Data, Analytics & AI function to unlock the full potential of our data assets. We are looking to hire a Lead Data Engineer to join our team.
The Lead Data Engineer is a high-impact, hands-on leadership role (80% hands-on coding) with the opportunity to design and build Fundment’s data and analytics infrastructure from the ground up. Reporting to our Head of Data & Analytics, you will create the data foundations that allow us to deliver data-driven insights, automated systems and AI-powered products that drive business growth and enhance operational efficiency.
Key Responsibilities:
Data Platform Architecture & Strategy:
* Lead the implementation of a robust, scalable, and secure data platform architecture on Google Cloud Platform (GCP).
* Define and enforce technical standards, design patterns, and best practices for data ingestion, processing, storage, and consumption.
* Ensure our data infrastructure follows best practices across data governance, cataloguing, quality and lineage.
* Ensure the data platform adheres to all regulatory requirements (e.g., FCA, GDPR) and implement appropriate access control mechanisms.
Data Ingestion and Processing:
* Design and deliver complex data pipelines for both batch and real-time streaming processing, drawing on internal and external data sources.
* Build and optimise our query and transform capabilities across both structured and unstructured data.
* Define and implement a unified data and metrics framework to ensure consistent definitions and understanding of KPIs across the organisation.
Data/ML Operations:
* Establish and lead the design of CI/CD processes and tooling for data pipelines, enabling scalable, automated, and high-quality delivery across environments.
* Implement data observability and data quality monitoring to ensure our data is continually able to support our operations and drive business growth.
* Design and build end-to-end MLOps pipelines for continuous improvement in model performance.
* Ensure that all data components are provisioned and controlled through Infrastructure As Code (IaC).
* Monitor, analyse, and optimise the cost efficiency of our GCP data platform.
Team Leadership & Mentorship:
* Build our data engineering capability and recruit other data engineers of the highest calibre.
* Provide technical guidance, mentorship, and code reviews to our multi-disciplinary team of analysts, data scientists and engineers.
* Translate requirements into technical specifications and project plans, overseeing execution from conception to production.
* Collaborate with cross-functional teams across the company to help both technical and non-technical users leverage our data, analytics and AI capabilities in an optimal and secure way.
Required Skills/Experience
* Experience: 7+ years of proven data engineering experience.
* Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
* Experience in a startup or high-growth environment.
Technical skills:
* Advanced proficiency in SQL and Python for data manipulation and analysis.
* Experience with Google data infrastructure, including Dataplex, Data Catalog and BigQuery.
* Experience with Google data processing, including Datastream, Dataflow and Cloud Composer.
* Experience building and maintaining data transformation layers using dbt.
* Strong experience with data visualisation tools (e.g. Looker).
* Experience with data observability and monitoring.
* Experience with containerisation technologies, including Cloud Run and Docker.
* Experience with Infrastructure as Code (e.g. Terraform).
* Experience with MLOps pipelines.
* Communication: Excellent written and verbal communication, presentation and interpersonal skills
Preferred Skills/Experience
* Knowledge of data privacy and AI regulation, preferably within financial services.
* Familiarity with real-time data processing and streaming analytics.
* Experience with the GCP Vertex AI platform, including Vertex AI Pipelines, Model Registry, and Vertex AI Endpoints.
* Experience building the infrastructure that facilitates deployment and production tuning of ML models and AI systems (including LLMs).
* GCP Professional Data Engineer or Professional Machine Learning Engineer certifications.
Why join us?
Become part of our flexible, dynamic and supportive work environment, where our innovative team values your ideas and collaboration drives our success. Make an impact from day one, challenge yourself to continually improve and raise standards, and see how your work contributes to future goals.
We are happy to consider any reasonable adjustments that applicants may need during the recruitment process.
Company Benefits
* Be part of a modern, inclusive, high-trust engineering culture
* Take ownership and ship code that directly improves client outcomes
* Work with a smart, friendly team that values balance, growth, and support
* Pension – 6% employer contribution.
* BUPA Private Health Insurance – fully paid for by the company, for you and your immediate family.
* Medicash Cashplan – fully paid for by the company, for you and your immediate family.
* Travel insurance – fully paid for by the company, for you and your immediate family.
* Life Assurance – 4 x base salary.
* Employee Assistance Programme
* 28 days annual leave plus bank holidays.
* Paid compassionate leave – up to 5 days per year.
* Enhanced paternity/maternity/adoption leave – 16 weeks at full pay after 12 months of service.
* Jury service – 10 days at full pay.
* Hybrid working arrangements – 3 days per week in the Fitzrovia office.
* Coaching & Counselling sessions
* Training Budget
* Annual pay review