Salary: £50,000–£60,000 per year

Requirements:
- Hands-on experience in Snowflake data warehouse development and optimisation.
- Strong SQL skills for querying, transformation and performance tuning.
- Experience building and managing ETL/ELT pipelines.
- Proficiency with at least one scripting/programming language (e.g., Python).
- Familiarity with modern data engineering tools such as dbt, Airflow, Prefect, or similar is a plus.
- Knowledge of cloud platforms (AWS / Azure / GCP).
- Understanding of data modelling, quality controls and best practices.
- Degree in Computer Science, Data Engineering, IT or a related field (or equivalent experience).
- Snowflake certifications or relevant cloud/data engineering certifications are advantageous.

Responsibilities:
- Design, build and maintain scalable ETL/ELT data pipelines using Snowflake.
- Administer, optimise and support the Snowflake data platform for performance and cost efficiency.
- Ingest, transform and integrate data from multiple sources (e.g., GA4, internal systems).
- Develop and maintain data models to support analytics, reporting and business use cases.
- Ensure high data quality through monitoring, testing and documentation of pipelines and models.
- Collaborate with BI, analytics and engineering teams to ensure data meets business needs.
- Support data governance, security, compliance and best practices in data engineering.

Technologies: Airflow, AWS, Azure, Cloud, Data Warehouse, ETL, GCP, Support, Python, SQL, Security, Snowflake, dbt

More: We are looking for a talented Snowflake Data Engineer to join our growing data team based in Brighton. You'll play a key role in enhancing our data platform, automating data workflows and enabling data-driven decision-making across the business. We value diversity and inclusion and are committed to providing a supportive environment for all individuals, including those with disabilities. We are dedicated to fostering a culture of flexible working options.

Last updated: week 5 of 2026