Our client is a global investment management firm that utilizes long-short equity, long-only, fixed income, and commodity futures strategies across its funds. They are seeking to add an experienced Data Engineer / Risk Developer.
They have deep expertise in trading, technology, and operations and attribute their success to rigorous scientific research. As a technology- and data-driven firm, they design and build their own cutting-edge systems, from high-performance trading platforms to large-scale data analysis and compute farms.
The group manages the lifecycle of data used by investment teams for trading, backtesting, and research, working with quants and tech teams to integrate, process, and serve data from vendors and public sources in the firm's data infrastructure (alpha data and cross-asset referential data).
They are currently seeking new data developers to join their growing teams. You will be part of a learning culture where teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued.
As a data developer, you would join a fast-paced Python development team working closely with quantitative researchers to design, build, test, and maintain data pipelines that onboard new data sets for research on new trading strategies. The team owns entire pipelines: ingesting data from the outside world, transforming it into time series of actionable insights, and designing the data models exposed to quantitative researchers. They also support these pipelines in production during live trading and contribute to the data platform by building new frameworks, libraries, and full-stack services used to build data pipelines.
Responsibilities:
* Acquire a deep understanding of the data requirements of investment research teams to deliver the right solutions.
* Design, implement, test, optimize and troubleshoot Python data pipelines, frameworks and services.
* Collaborate with and influence technologists and investment researchers to ensure the data pipelines and platform meet constantly evolving requirements.
* Work closely with data operations and data platform developers to improve the data platform and reduce technical debt.
* Write and review technical documents, such as requirements docs for researchers, design docs to propose new platform solutions and production support runbooks.
Must-Have Qualifications:
* Bachelor's degree in Computer Science, Software Engineering or related subject.
* 2+ years' development experience with Python.
* Practical knowledge of commonly used protocols and tools used to transfer data (e.g. FTP, SFTP, HTTP APIs, AWS S3).
* Excellent communication skills.
Nice to have:
* 2+ years designing, testing, optimizing, and troubleshooting data-intensive applications.
* Experience in a Data Engineer role (strong understanding of data engineering principles and best practices).
* Experience with database technologies (PostgreSQL, SQL) and query optimization.
* Experience with data pipeline development and automation.
* Strong coding skills (mainly with Python).
* Risk knowledge (understanding of risk concepts such as VaR calculations) and a good understanding of statistics.
* Experience with version control systems (Git).
* Experience with big data frameworks, databases, distributed systems, Cloud or Web development.
* Experience with any of these: SQL, JavaScript, TypeScript, React, C++, kdb+/q, Rust.