The Client
This firm is a highly respected, technology-centric investment business operating across a broad range of asset classes. Its success is built on a mix of quantitative research, cutting-edge engineering and scalable data infrastructure. Engineers here play a central role: they design, build and maintain the platforms that underpin research, trading and large-scale data analysis.
It’s a collaborative environment where technical ownership is encouraged, engineering craft is valued, and impactful work directly supports sophisticated investment strategies.
What You'll Get
* Help design and build fast, scalable market-data systems used across trading and research groups.
* Contribute to a modern engineering ecosystem: Python, cloud-native tooling, containerisation and large-scale data lake technologies.
* Partner closely with exceptional quantitative researchers, data engineers and traders.
* Influence architectural decisions and continuously refine pipeline performance.
* Join a culture that values rigour, curiosity and continual improvement.
* Benefit from strong compensation and long-term career growth within a high-performing engineering organisation.
Role Overview
* Design, implement, and maintain high-throughput, low-latency pipelines for ingesting and processing tick-level market data at scale.
* Operate and optimise time-series databases (kdb+, OneTick) to efficiently store, query, and manage granular datasets.
* Architect cloud-native solutions for scalable compute, storage, and data processing, leveraging AWS, GCP, or Azure.
* Develop and maintain Parquet-based data layers; contribute to evolving the data lake architecture and metadata management.
* Implement dataset versioning and management using Apache Iceberg.
* Collaborate closely with trading and quant teams to translate data requirements into robust, production-grade pipelines.
* Implement monitoring, validation, and automated error-handling to ensure data integrity and pipeline reliability.
* Continuously profile and optimise pipeline throughput, latency, and resource utilisation, particularly in latency-sensitive or HFT-like environments.
* Maintain clear, precise documentation of data pipelines, architecture diagrams, and operational procedures.
What You Bring
* 3+ years of software engineering experience, preferably focused on market-data infrastructure or quantitative trading systems.
* Strong Python expertise with a solid grasp of performance optimisation and concurrency.
* Proven experience designing, building, and tuning tick-data pipelines for high-volume environments.
* Hands-on experience with Parquet storage; experience with Apache Iceberg or similar table formats is a plus.
* Practical experience with containerisation (Docker) and orchestration platforms (Kubernetes).
* Strong background in profiling, debugging, and optimising complex data workflows.
* Experience with time-series databases (kdb+, OneTick) and/or performance-critical C++ components.
* Deep understanding of financial markets, trading data, and quantitative workflows.
* Excellent communication skills with the ability to articulate technical solutions to engineers and non-engineers alike.