Job Description
Employment Type: Inside IR-35, engaged through an umbrella company.
Requirements (must have):
Education: Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines.
3+ years of experience working with Snowflake or similar cloud-based data warehousing solutions.
3+ years of experience in data development and solutions in highly complex data environments with large data volumes.
Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices.
Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment.
Experience with code versioning tools (e.g., Git)
Knowledge of Linux operating systems
Familiarity with REST APIs and integration techniques
Familiarity with data visualization tools and libraries (e.g., Power BI)
Background in database administration or performance tuning
Familiarity with data orchestration tools, such as Apache Airflow
Pre...