Overview
Onsite/Remote: 3 days per week onsite (mandatory), remainder remote
Start: ASAP
Duration: 12-24 months (extension very likely)
Language: English (must-have)
What you’ll do
* Build and maintain scalable data pipelines for large, complex datasets
* Work hands-on with structured & unstructured data across multiple sources
* Ensure data quality, reliability and performance end-to-end
* Collaborate closely with AI / ML teams to deliver model-ready datasets
* Support data governance, lineage and documentation (pragmatic, not bureaucratic)
What you bring
* Strong Data Engineering background (senior level)
* Experience with big data environments (e.g. Spark, distributed systems, cloud data platforms)
* Solid understanding of data modelling, ETL/ELT, and data pipelines
* Comfortable working onsite in Edinburgh 3x/week
Nice to have
* Exposure to ML / AI data workflows
* Experience with orchestration tools (Airflow / similar)
* Cloud experience (AWS / Azure / GCP)
What we need
If you are interested and available, or can recommend someone, please send your updated CV along with a short email including your contact details to: Joseph@WorkGenius.com.
Please always include:
* Availability start date
* Hourly rate (Edinburgh & Remote)
* A short 2–3 line summary explaining why your background is a good fit for this project