Senior Snowflake Implementation Consultant – Contractor Position
We are seeking a Senior Data Engineer with extensive infrastructure implementation experience, especially with Snowflake and connected applications, to lead technical delivery of our Snowflake migration project. You will help us migrate from an in‑house data warehouse to Snowflake as our core data infrastructure, supporting multiple client and first‑party (1P) integrations as well as our own reporting and visualisation solution, Seer.
Key Responsibilities
* Design and implement a Snowflake environment in line with an agreed migration strategy.
* Plan how to map nested JSON documents into Snowflake, documenting trade‑offs and recommendations; see the JSON‑flattening sketch after this list.
* Present architectural decisions and collaborate during planning stages.
* Configure Snowflake warehouses, roles, and security policies using Terraform or similar tools.
* Execute the migration of historical marketing data (terabyte scale) from AWS/MongoDB to Snowflake, ensuring high data parity.
* Build the “connective tissue” between the Seer frontend and Snowflake, extending the existing Seer Data API (FastAPI); see the FastAPI sketch after this list.
* Optimize Snowflake query performance, including clustering and materialized views.
* Implement a security model that maps Seer’s user management to Snowflake Row‑Level Security or Dynamic Data Masking; see the row‑access‑policy sketch after this list.
* Migrate ELT pipelines (dbt, Airflow, Fivetran) to ingest data from third‑party marketing APIs into Snowflake; see the Airflow sketch after this list.
* Contribute to CI/CD, version control (Git), and best‑practice development standards.
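By way of illustration, the JSON‑mapping item could start from a pattern like the one below: land each document in a VARIANT column, then flatten it into typed columns. This is a minimal sketch using snowflake-connector-python; the object names (RAW_EVENTS, PAYLOAD, CAMPAIGN_METRICS) are assumptions, and the real mapping would follow the agreed migration strategy.

```python
# Minimal sketch: flatten nested JSON landed in a VARIANT column into a
# typed reporting table. All object names here are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>",  # placeholder connection details
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)

FLATTEN_SQL = """
CREATE OR REPLACE TABLE CAMPAIGN_METRICS AS
SELECT
    e.PAYLOAD:campaign_id::STRING        AS campaign_id,
    e.PAYLOAD:recorded_at::TIMESTAMP_NTZ AS recorded_at,
    m.value:name::STRING                 AS metric_name,
    m.value:amount::FLOAT                AS metric_value
FROM RAW_EVENTS e,
     LATERAL FLATTEN(input => e.PAYLOAD:metrics) m
"""

cur = conn.cursor()
try:
    cur.execute(FLATTEN_SQL)
finally:
    cur.close()
    conn.close()
```

The trade‑off to document is essentially VARIANT plus views (flexible, schema‑on‑read) versus fully flattened tables (faster scans, stricter contracts).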
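For the “connective tissue” item, a FastAPI route that runs a parameterised Snowflake query might look like the sketch below. The route path, table name, and connection handling are assumptions, not Seer’s actual API.

```python
# Hypothetical Seer Data API route backed by Snowflake; all names are
# illustrative, and credentials would come from configuration in practice.
from fastapi import Depends, FastAPI
import snowflake.connector

app = FastAPI()

def get_conn():
    # The real service would reuse a pooled or cached connection.
    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="<schema>",
    )
    try:
        yield conn
    finally:
        conn.close()

@app.get("/campaigns/{campaign_id}/metrics")
def campaign_metrics(campaign_id: str, conn=Depends(get_conn)):
    cur = conn.cursor(snowflake.connector.DictCursor)
    try:
        # Bind the parameter rather than interpolating it into the SQL string.
        cur.execute(
            "SELECT metric_name, metric_value, recorded_at "
            "FROM CAMPAIGN_METRICS WHERE campaign_id = %s "
            "ORDER BY recorded_at",
            (campaign_id,),
        )
        return cur.fetchall()
    finally:
        cur.close()
```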
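One plausible shape for the security‑model item (an assumption, not a confirmed design) is a row access policy driven by a mapping table that ties Snowflake roles, one per Seer user group, to the client rows they may see:

```python
# Hypothetical row-level security wiring. A mapping table ties Snowflake
# roles (one per Seer user group) to client IDs; the policy filters rows
# to the clients mapped to the caller's current role. Names are illustrative.
RLS_STATEMENTS = [
    """
    CREATE TABLE IF NOT EXISTS CLIENT_ACCESS_MAP (
        snowflake_role STRING,
        client_id      STRING
    )
    """,
    """
    CREATE OR REPLACE ROW ACCESS POLICY seer_client_policy
    AS (client_id STRING) RETURNS BOOLEAN ->
        EXISTS (
            SELECT 1 FROM CLIENT_ACCESS_MAP m
            WHERE m.snowflake_role = CURRENT_ROLE()
              AND m.client_id = client_id
        )
    """,
    "ALTER TABLE CAMPAIGN_METRICS ADD ROW ACCESS POLICY seer_client_policy ON (client_id)",
]

def apply_rls(conn) -> None:
    # Run the DDL once as part of environment provisioning.
    cur = conn.cursor()
    try:
        for stmt in RLS_STATEMENTS:
            cur.execute(stmt)
    finally:
        cur.close()
```

Dynamic Data Masking would follow the same pattern, with CREATE MASKING POLICY applied to individual columns instead of a row filter.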
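And for the pipeline‑migration item, a minimal Airflow DAG that triggers dbt after ingestion might be shaped like this sketch; the DAG ID, schedule, and commands are placeholders, and Fivetran syncs would typically be triggered via its API or provider package rather than shown here.

```python
# Hypothetical Airflow DAG: run ingestion, then build dbt models in Snowflake.
# The dag_id, schedule, and commands are illustrative placeholders.
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="marketing_elt",
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_marketing_apis",
        bash_command="python ingest.py",  # placeholder ingestion step
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --profiles-dir /opt/dbt --project-dir /opt/dbt",
    )
    ingest >> transform  # transform only after ingestion succeeds
```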
Project Plan Overview
We envisage a 5‑7 month engagement:
* Months 1‑2: Discovery, foundation, Snowflake environment setup, naming conventions.
* Months 3‑4: Migration & modelling, historical data movement, dbt models.
* Month 5: BI integration, Seer front‑end, dashboards, stakeholder validation.
* Month 6: Handoff, documentation, training, performance monitoring.
Qualifications
* Advanced Snowflake qualifications.
* Extensive experience in a data engineering or analytics engineering role, with marketing analytics exposure strongly preferred.
* Bachelor’s or Master’s degree in Computer Science, Data Engineering, Statistics, or related.
* Strong programming skills in Python and SQL; familiarity with Java, Scala, or other languages.
* Proficiency in ETL tools and frameworks (Apache Airflow, dbt).
* Experience with data visualisation tools (Tableau, Power BI).
* Familiarity with cloud platforms (AWS, GCP, Azure) and data warehouse technologies.
* Knowledge of database design principles and query optimisation.
* Experience with Git and CI/CD pipelines.
* Excellent problem‑solving and communication skills.
Budget and Application
This project will have a fixed budget and timeline, agreed with the consultant. We would welcome a cost estimate based on the details and assumptions set out in this document. Shortlisted applicants will have an opportunity to discuss the plan in more detail and to revise its scope, timescales, and budget.