Required Skills
* 4+ years of hands-on experience as a Data Engineer, including at least 2 years working specifically with Google Cloud Platform (GCP) data services.
* Strong proficiency in SQL and experience with schema design and query optimization for large datasets.
* Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning.
* Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
* Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development.
* Understanding of data warehousing and data lake concepts and best practices.
* Experience with version control systems (e.g., Git).