* Inside IR35 Contract - Up to £850 per day
* 3 Days per Week in London
* 6-Month Initial Contract (2 Years+ Project Scope)
* Start Date: April 2026
We are working alongside a leading Commodities Trading Firm that is searching for a Senior Data Engineer to join the business. You'll be building pipelines to support their trading and analytics platforms!
Project & Responsibilities
* Design and develop scalable Python-based data pipelines using Dagster to ingest, transform, and integrate datasets into the enterprise data platform.
* Build ingestion frameworks to extract data from internal systems, trading platforms, APIs, and external providers, supporting both batch and real-time data flows.
* Implement streaming data integrations using Apache Kafka to enable event-driven processing and near real-time analytics.
* Land and manage curated datasets within AWS storage environments, ensuring data integrity, scalability, and cost efficiency.
* Apply strong data governance practices, including validation, lineage tracking, schema consistency, and secure access controls across the data platform.
* Ensure production stability and operational reliability of data pipelines through monitoring, alerting, logging, and proactive incident management.
* Break down complex data integration initiatives into structured technical deliverables, defining implementation strategies and mitigating delivery risks.
* Provide engineering leadership through code reviews and CI/CD best practices, mentoring engineers and promoting scalable, maintainable data platform solutions.
Technical Skill Set Required
* Strong Python data engineering experience, including orchestration with Dagster, advanced SQL, and data modelling.
* Experience building streaming and event-driven pipelines using Apache Kafka and integrating heterogeneous data sources.
* Solid background operating cloud-native data platforms on Amazon Web Services, with CI/CD, version control, and production operations experience.