Key Responsibilities
* Participate in the entire development lifecycle, from brainstorming ideas to implementing elegant solutions that deliver data insights.
* Gather requirements, then model and design solutions to support product analytics, business analytics, and advanced data science.
* Design efficient and scalable data pipelines using cloud‑native and open source technologies.
* Develop and improve ETL/ELT processes to ingest data from diverse sources.
* Work with analysts to understand requirements and develop technical specifications for ETLs, including documentation.
* Support production code to produce comprehensive and accurate datasets.
* Automate deployment and monitoring of data workflows using CI/CD best practices.
* Promote strategies to improve data modelling, quality and architecture.
* Participate in code reviews, mentor junior engineers, and contribute to team knowledge sharing.
* Document data processes, architecture, and workflows for transparency and maintainability.
* Work with big data solutions, data modelling, ETL pipelines, and dashboard tools.
Required Qualifications
* 4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
* Proficiency in writing SQL queries and knowledge of cloud‑based databases like Snowflake, Redshift, BigQuery or other big data solutions.
* Experience with data modelling (including tools such as dbt), ETL processes, and data warehousing.
* Experience with at least one of Python, C++, or Java.
* Experience with version control and code review tools such as Git.
* Knowledge of modern data pipeline orchestration tools such as Airflow.
* Experience with cloud platforms (AWS, GCP, or Azure) and with containerization and infrastructure‑as‑code tools (e.g., Docker, Terraform, CloudFormation).
* Familiarity with data quality, data governance, and observability tools (e.g., Great Expectations, Monte Carlo).
* Experience with BI and data visualization tools (e.g., Looker, Tableau, Power BI).
* Experience working with product analytics solutions (e.g., Amplitude, Mixpanel).
* Experience working with mobile attribution solutions (e.g., Appsflyer, Singular).
* Experience working on a mobile game or a mobile app, ideally from early stages of the product life cycle.
* Experience working in an Agile development environment and familiarity with process management tools such as JIRA, Targetprocess, Trello, or similar.
Nice to Have
* Familiarity with data security, privacy, and compliance frameworks.
* Exposure to machine learning pipelines, MLOps, or AI‑driven data products.
* Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark.
* Exposure to AI/ML concepts and collaboration with data science or AI teams.
* Experience integrating data solutions with AI/ML platforms or supporting AI‑driven analytics.
Seniority level
Mid‑Senior level
Employment type
Full‑time
Job function
Engineering and Information Technology
Industries
Computer Games