Data Engineer
Location: Glasgow (hybrid, 3 days per week)
Contract role (6 to 12 months)
Skills / Qualifications:
· 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
· 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
· 3+ years of experience working with Snowflake or similar cloud-based data warehousing solutions
· 3+ years of experience in data development and solutions in highly complex data environments with large data volumes
· Experience with code versioning tools (e.g., Git)
· Knowledge of Linux operating systems
· Familiarity with REST APIs and integration techniques
· Familiarity with data visualization tools and libraries (e.g., Power BI)
· Background in database administration or performance tuning
· Familiarity with data orchestration tools, such as Apache Airflow
· Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing