One of our clients in the Trading & Market Making space is looking for an engineer to serve as a frontline point of contact for traders, researchers and internal users of data platforms within the London-based trading team.
This role sits at the intersection of Data Engineering and Reliability & Operations Engineering, and is an excellent opportunity to work in a dynamic, impactful function within one of the most cutting-edge trading environments in the city!
What You'll Do:
* Serve as frontline POC for traders, internal users and research teams for everything relating to data reliability.
* Investigate and resolve data quality concerns and freshness anomalies.
* Monitor ingestion pipelines, processes and real-time feeds.
* Triage and address alerts promptly to minimise impact on trading.
* Manage relationships with external data vendors: resolve upstream issues, handle specification changes and ensure accurate delivery of datasets.
* Document and relay clear updates to users during production events.
Technical responsibilities:
* Ingest, configure and operationalise new datasets.
* Develop and maintain ETL/ELT data pipelines feeding real-time trading and research systems.
* Implement data quality checks, anomaly detection and monitoring frameworks.
* Build and maintain high-performance APIs to expose market and reference data to trading, research and analytics platforms.
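To give a flavour of the data quality and freshness work described above, here is a minimal, purely illustrative pandas sketch. All names (`check_quality`, the column names, the staleness budget) are hypothetical and not taken from the client's actual stack.

```python
from datetime import datetime, timedelta, timezone

import pandas as pd

# Hypothetical market-data snapshot; symbols and columns are illustrative only.
now = datetime.now(timezone.utc)
quotes = pd.DataFrame({
    "symbol": ["AAPL", "MSFT", "GOOG"],
    "price": [185.2, 411.5, -1.0],                              # negative price -> validity violation
    "ts": [now, now - timedelta(minutes=2), now - timedelta(seconds=30)],
})

def check_quality(df: pd.DataFrame, max_staleness: timedelta) -> list[str]:
    """Return human-readable data-quality violations for a quote snapshot."""
    issues = []
    # Completeness: no missing values in key columns.
    if df[["symbol", "price", "ts"]].isna().any().any():
        issues.append("missing values in key columns")
    # Validity: prices must be strictly positive.
    bad = df.loc[df["price"] <= 0, "symbol"].tolist()
    if bad:
        issues.append(f"non-positive prices for {bad}")
    # Freshness: the newest timestamp per symbol must be within the staleness budget.
    latest = df.groupby("symbol")["ts"].max()
    stale = latest[latest < datetime.now(timezone.utc) - max_staleness].index.tolist()
    if stale:
        issues.append(f"stale data for {stale}")
    return issues

violations = check_quality(quotes, max_staleness=timedelta(minutes=1))
```

In practice checks like these would run inside a monitoring framework and feed the alerting that the triage responsibilities above describe.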
Who you are:
* 3+ years in Data Engineering, SRE, SWE or DataOps roles in high-performance, time-sensitive environments.
* Strong Python proficiency, including exposure to libraries such as Pandas, Arrow, and Spark.
* Strong grasp of data modelling, normalisation and API development for large-scale analytical or trading systems.
* Experience with lakehouse architectures (ideally Databricks or Delta Lake).
* Exposure to real-time and historical market data in Fixed Income, ETFs or Equities is a strong plus.
* Operational instincts: strong agency and ownership, tackling problems with initiative.