Requirements
* Several years of relevant work experience
* Take end-to-end responsibility for building, optimizing and supporting existing and new data products towards the defined target vision
* Champion the DevOps mindset and principles, and be able to manage CI/CD pipelines, Terraform and cloud infrastructure; in our context, this is GCP (Google Cloud Platform)
* Evaluate and drive continuous improvement and reduce technical debt in the teams
* Design and implement efficient data models and data pipelines that support analytical requirements, with hands-on data modelling experience and a good understanding of different data modelling techniques and their trade-offs
* Experience with data query languages (SQL or similar) and knowledge of ETL processes and tools
* Experience in data-centric and API programming (for automation) using one or more programming languages such as Python, Java or Scala
* Knowledge of NoSQL and RDBMS databases
* Experience with different data formats (e.g., Avro, Parquet)
* Have a collaborative and co-creative mindset with excellent communication skills
* Motivated to work in an environment that allows you to work and take decisions independently
* Experience in working with data visualization tools
* Experience with GCP tools: Cloud Functions, Dataflow, Dataproc and BigQuery
* Experience with data processing frameworks such as Beam, Spark, Hive and Flink
* GCP data engineering certification is a merit
* Hands-on experience with analytical tools such as Power BI or similar visualization tools
* Able to create intermediate-level DAX measures to enhance data models and visualizations
* Understanding of Microsoft Excel features such as Power Pivot and Power Query, as well as related tooling such as Tabular Editor and DAX
* Fluent in English both written and verbal
The candidate will:
* Partner with retail business units (e.g., merchandising, supply chain, stores, digital) to design and deliver domain-aligned data products that power analytics and machine learning initiatives.
* Translate complex retail business needs into technical requirements and proactively identify opportunities for data-driven innovation.
* Lead the development and deployment of data mesh architecture, ensuring federated governance, discoverability, and self-serve capabilities.
* Design and build scalable data pipelines using GCP (BigQuery, Dataflow, Cloud Composer, Cloud Functions) and orchestrate transformations using DBT.
* Develop modular, reusable DBT models for core retail metrics such as inventory accuracy, sales trends, promotions performance, and customer loyalty.
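To illustrate the kind of reusable metric logic such DBT models would encapsulate, here is a minimal Python sketch of one of the listed metrics, inventory accuracy. All names and the metric definition are illustrative assumptions; in practice this logic would live in a SQL-based DBT model over warehouse tables.

```python
from dataclasses import dataclass

@dataclass
class StockRecord:
    """One SKU's system quantity vs. its physical count (hypothetical schema)."""
    sku: str
    recorded_qty: int  # quantity according to the inventory system
    counted_qty: int   # quantity found during a physical stock count

def inventory_accuracy(records: list[StockRecord]) -> float:
    """Share of SKUs whose recorded quantity matches the physical count.

    One common retail definition of inventory accuracy; a DBT model would
    express the same aggregation in SQL.
    """
    if not records:
        return 0.0
    matches = sum(1 for r in records if r.recorded_qty == r.counted_qty)
    return matches / len(records)
```

For example, with three SKUs of which two match their physical count, the function returns 2/3.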