ALPTech is an AI-native consulting and risk analytics firm. We deliver high-impact solutions to the banking, insurance, and investment sectors. We are looking for an intellectually curious Part-Time Data Engineer to join our technical team.
This role is specifically designed for current PhD candidates who wish to apply their advanced problem-solving skills to real-world financial data architecture challenges.
The Role
We are seeking a technically rigorous PhD student with a background in Computer Science, Statistics, or related STEM fields. You will play a critical role in building and optimizing the data infrastructure that powers our risk analytics and consulting projects.
You will work closely with our Quant and Consulting teams to ensure our data pipelines are robust, scalable, and accurate. This is a hands-on engineering role requiring strong SQL capabilities and an architectural mindset.
Key Responsibilities
Data Architecture & Optimization
* Take ownership of the data architecture design, ensuring systems are built for high performance and long-term scalability.
* Analyze existing database schemas and query performance; implement optimizations to handle increasing data loads efficiently.
ETL Development & Pipeline Management
* Design, build, and maintain robust ETL pipelines to ingest, transform, and store data from various external sources.
* Collaborate with Quant/AI teams to transform raw financial data into model-ready features. This includes building logic for time-series aggregation, handling missing data, and calculating derived risk metrics.
* Write complex, high-efficiency SQL queries and manage stored procedures to support data extraction and reporting requirements.
Data Integration & Integrity
* Develop API integrations to connect disparate services and automate data flows.
* Manage and process both structured (relational) and unstructured data formats.
* Implement rigorous data validation checks to ensure accuracy and consistency across all data assets.
Candidate Profile
Education
* Must be currently enrolled in a PhD program in Computer Science, Statistics, Engineering, or a related quantitative discipline at a top 10-ranked UK university.
Required Skills & Experience
* Data Engineering Experience: 1–2 years of hands-on experience in data engineering or database management (academic research projects and industry experience both count).
* Programming: Experience with Python or R for data manipulation.
* Cloud Platform Familiarity: Experience working with cloud data ecosystems such as AWS, Azure, or GCP.
* Advanced SQL Proficiency: Demonstrated ability in query optimization, performance tuning, and writing complex stored procedures.
* Feature Engineering & ETL: Experience in designing data pipelines and feature engineering (feature creation, selection, and transformation) for analytical or machine learning purposes.
* BI Tooling: Proficiency with visualization and business intelligence tools such as Power BI or Tableau.
* Analytical Mindset: A strong focus on connecting technical data problems to tangible business outcomes.
Preferred Qualifications (Nice to have)
* Basic understanding of financial data, quantitative finance, or risk management concepts.
What We Offer
* This is a paid position, compensated via an hourly rate or salary.
* High-performing candidates will be considered for a full-time role upon graduation or completion of the internship.
* Opportunity to apply advanced technical skills in a high-level commercial setting (Risk & Finance).
* Collaboration with a senior team of experts in AI, Risk, and Quantitative Analytics.