Parameta Solutions - Data Engineer, London
Client: TP ICAP
Location: London, United Kingdom
Job Category: -
EU work permit required: Yes
Job Reference: 929f408c054f
Job Views: 6
Posted: 05.05.2025
Expiry Date: 19.06.2025
Job Description:
Role Overview
We operate a hybrid model where brokers provide business-critical intelligence to clients, supplemented by proprietary screens for historical data, analytics and execution functionality. Globally, we are the leading provider of proprietary over-the-counter pricing information and a unique source of data on financial, energy and commodities products. Our market data is independent, unbiased and non-position-influenced. Our clients include banks, insurance companies, pension and hedge funds, asset managers, energy producers and refiners, as well as risk and compliance managers and charities.
We are looking for a passionate and capable Data Engineer who wants to make a real impact and build software they can be proud of. We work in a collaborative, fast-paced environment where you will design, build and deploy new systems and products.
Role Responsibilities
* Building performant batch and streaming data pipelines.
* Data warehouse development.
* Cloud based development.
* Improving CI/CD Processes.
* Maintaining data applications, pipelines and databases.
* Participating in daily stand-ups and agile development teams.
* Writing unit, integration and data quality tests.
* Contributing to documentation and best practice guidelines.
* Staying up to date with current technology and techniques.
Experience / Competences
* Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline.
* Experience working with Python and SQL; other languages such as Java, C# or C++ are also useful.
* Experience with time-series market data is desirable.
* Able to write clean, scalable and performant code.
* Proven written and verbal communication skills including an ability to effectively communicate with both business and technical teams.
* Some knowledge of Linux and the command line.
* Understanding of ETL and event streaming (e.g. Kafka).
* Snowflake, Kubernetes and Airflow experience is desirable.
* Experience with Amazon Web Services (AWS) would be beneficial.
* Basic knowledge of data science topics like machine learning, data mining, statistics, and visualisation.