Job Title: Analytics Engineer
Employment Type: Full-time
Job Family: Technology
Location: Uxbridge, with flexibility to work from home.
Key Responsibilities
* Design and implement robust data models (e.g., star schema, snowflake schema, data vault).
* Develop and maintain dimensional data models to support BI and reporting requirements.
* Develop and implement analytics solutions to track key performance metrics.
* Design and build data pipelines to collect, process, and store large volumes of structured and unstructured data from various sources.
* Develop and maintain data quality checks and data validation processes.
* Develop and automate reports, dashboards, and data visualisations to communicate insights and trends effectively to stakeholders.
* Build and maintain tooling and frameworks to automate data pipelines for experimentation and ML modelling.
* Develop and maintain a deep understanding of product domains, ensuring relevant events are produced and that new entities and processes are integrated downstream into the Snowflake data platform model.
* Monitor and troubleshoot data pipeline issues and provide timely resolution.
* Work closely with product managers, data scientists, product analysts, and software engineers to identify analytical requirements.
Must Have
* Bachelor's degree in computer science, engineering, mathematics, or a related field.
* 3+ years of experience in data/analytics engineering with a focus on building data pipelines.
* Proficiency in SQL and experience with one or more programming languages such as Python or Java.
* Experience with modern cloud data warehouse platforms such as Snowflake, BigQuery, Redshift or similar.
* Experience with cloud-based data platforms, particularly AWS or GCP.
* Experience with data warehousing, data modelling, and ETL development.
* Strong analytical and communication skills, with an understanding of what drives a product's performance toward the company's commercial goals.
* Hands‑on experience with data visualisation tools such as Tableau, Looker, Streamlit or Power BI.
* Strong problem‑solving skills and attention to detail.
Valuable Skills
* Previous experience in similar analytics engineering roles with focus on product analytics and data modelling.
* Experience working with distributed event stores and stream‑processing platforms such as Kafka or Kinesis.
* Experience with batch processing and orchestration frameworks such as dbt, Apache Airflow, or Argo Workflows.
* Familiarity with Docker, Kubernetes, and Amazon EKS.
* Familiarity with continuous integration using GitHub Actions.
* Familiarity with Test-Driven Development and Extreme Programming (XP).
Our Commitment To Equity, Diversity And Inclusion
At giffgaff we want to challenge the old way of doing things. People, and the way they work, define our culture and we encourage everyone to bring their whole selves to the gaff. That's why we believe in creating an equitable, fairer, and more inclusive business that champions different ideas and perspectives. We may be small but we're big on that caring, sharing thing and strive to create a supportive culture.
Benefits
In return for your outstanding efforts, you'll be rewarded with a competitive salary and excellent benefits.