Overview
A growing digital asset company dedicated to building robust data infrastructure to support market-leading technology products.
The Role
Develop scalable data pipelines and processing systems used across the business to drive analytics and decision-making.
Key Responsibilities
* Architect and maintain ETL processes and data stores.
* Collaborate with stakeholders to deliver quality data solutions.
* Ensure data is accurate, secure, and performant.
* Optimize infrastructure for data processing and storage.
* Document systems and enforce data governance.
About You
* Experienced with Python, SQL, and big data tools such as Spark and Kafka.
* Familiar with cloud-native architectures (AWS/GCP/Azure).
* Comfortable working with large-scale data in financial contexts.
* Strong problem-solving and teamwork capabilities.
Why Join
* Influence key data initiatives impacting global markets.
* Work in an agile culture of continuous improvement.
* Access cutting-edge technologies and training.
If you have any further questions or want to hear more about the role, simply apply or contact Brendan McCrory directly on LinkedIn or WhatsApp.