Role: Data Engineer (Python, Databricks, Snowflake, ETL)
Location: Glasgow, UK (3 days/week On-Site)
Job Type: Contract
Skills / Qualifications:
* 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark
* 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
* 3+ years of experience working with Snowflake or similar cloud-based data warehousing solutions
* 3+ years of experience delivering data solutions in highly complex environments with large data volumes
* Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices
* Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment
* Experience with code versioning tools (e.g., Git)
* Knowledge of Linux operating systems
* Familiarity with REST APIs and integration techniques
* Familiarity with data visualization tools and libraries (e.g., Power BI)
* Background in database administration or performance tuning
* Familiarity with data orchestration tools, such as Apache Airflow
* Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing
* Strong analytical skills, including the ability to interpret customer business requirements and translate them into technical designs and solutions.
* Strong verbal and written communication skills. Capable of collaborating effectively with a variety of IT and business groups across regions and roles, and of interacting effectively at all levels.
* Self-starter with a proven ability to manage multiple concurrent projects with minimal supervision. Able to manage a complex, ever-changing priority list and resolve conflicts between competing priorities.
* Strong problem-solving skills. Ability to identify where focus is needed and bring clarity to business objectives, requirements, and priorities.