The Main Event: What You’ll Drive, Build, and Own
* Architect Our Data's Future: Design, build, and own the transformation layer of our data stack using dbt and Snowflake, creating clean, reliable, and scalable data models that serve as the single source of truth.
* Guarantee Data Trust and Integrity: Develop automated reconciliation processes that validate raw data arriving from Kafka against our core application portals, ensuring every number can be trusted and establishing a verified single source of truth for each dataset.
* Build a Self-Serve Platform: Empower business users and analysts by developing curated tables and connecting them to our BI tool, Looker, enabling true self-service analytics.
* Automate for Quality and Scale: Implement robust CI/CD pipelines and GitHub automations (e.g., required tests, SQL linting) that validate pull requests before merging, enforcing high standards and allowing the team to scale confidently.
* Implement Modern Data Governance: Develop and implement strategies for handling Personally Identifiable Information (PII) and manage the full data lifecycle (from ingestion to archival), ensuring our platform is secure, compliant, and cost-effective.
* Optimize for Scale and Cost: Develop a deep understanding of our Snowflake usage and AWS costs, implementing best practices to ensure our platform is as efficient as it is powerful.
The Perfect Match: What It Takes to Succeed at Huspy
* Proven Experience: 4+ years in Analytics Engineering or a data-modeling-heavy Data Engineering role.
* dbt is Your Superpower: Expert-level proficiency with dbt (Core or Cloud). You can speak fluently about macros, packages, testing, and project structure.
* SQL & Data Warehousing Master: Advanced SQL skills and deep experience with a cloud data warehouse, preferably Snowflake.
* A Modeler’s Mindset: Strong, demonstrated understanding of dimensional modeling concepts (e.g., Kimball methodology) and the ability to design data models that are both comprehensive and easy to understand.
* Pragmatic Programmer: Proficiency in Python for scripting, automation, and data pipeline development.
* Cloud Native: Hands-on experience with cloud platforms, preferably AWS.
* Excellent Communicator: You can translate complex technical concepts for non-technical stakeholders and turn business requirements into robust data models.
The Edge: What Will Make You Stand Out
* Experience building and maintaining BI layers in tools like Looker (LookML).
* Experience working with event streaming data from Kafka.
* Familiarity with data governance tools.
* A keen eye for cost optimization in a cloud data warehouse environment.
Work Setup: Hybrid in Dubai or Madrid, or remote anywhere in the EU. Relocation support to Dubai or Madrid is available if preferred.