In detail, the position encompasses the following duties and responsibilities:
An experienced Data Engineer is required for the Surveillance IT team to develop ingestion pipelines and frameworks across the application portfolio, supporting Trade Surveillance analysts with strategy and decision-making. The team has recently developed a new data lakehouse infrastructure and ingestion architecture and is entering a phase of hardening and re-engineering to enhance the speed, ease, and quality of ingestion across the portfolio.
The ideal candidate possesses:
* Bachelor's degree in Information Systems, Computer Science, Engineering, or related field, or the equivalent combination of education, training and experience.
* 5+ years of proven experience in a data engineering role, with a strong track record of designing, building, and maintaining data pipelines and data architectures.
* Strong communicator with good interpersonal skills and the ability to convey technical issues to stakeholders.
* Experience with Agile methodologies and working in Scrum, Kanban, or similar frameworks.
* Demonstrated ability to pick up new skills and business concepts quickly.
* Ability to adapt to organisational change and shifting priorities.
* High work standards with a quality orientation and good attention to detail.
* Ability to work as part of a team whilst also working on own initiative.
* Good problem-solving skills and the ability to think quickly under pressure.
Skills:
* Programming Languages: Strong proficiency in Python and PySpark.
* Database Management: Expertise in SQL for data manipulation and querying.
* Data Orchestration: Experience with orchestration tools such as Apache Airflow or Dagster.
* Containerization: Familiarity with containerization technologies, specifically Kubernetes and Docker.
* Data Pipelines: Proven experience in designing and implementing data pipelines, working with big data technologies and architectures.
* Continuous Integration/Continuous Deployment (CI/CD): Knowledge of CI/CD practices and associated tools to automate testing and deployment processes.
* Financial Knowledge: Understanding of trade and financial concepts, particularly in relation to market data.
* Exchange Data: Experience with real-time and historical market prices, trade details, order book information, market indicators, exchange metadata, news sentiment, and regulatory compliance data.
Desirable skills and experience:
* Azure/Cloud services: Experience with Azure services, including App Services, Azure Functions, Cosmos DB, and Synapse.
* Trade Surveillance: Familiarity with trade surveillance systems and methodologies, including monitoring for suspicious trading activities, detecting market manipulation, and ensuring compliance with regulatory requirements. Experience in analyzing trade patterns and anomalies to mitigate risks and enhance market integrity.
* Commodities experience: Understanding of operations or trading in the commodities context, particularly in Oil, Gas, Shipping, and related activities.