Role: Data Engineer
Location: Glasgow (3 days per week)
Key Requirements/Expertise
* Primary skills: data engineering, Python, PySpark
* Expertise in data engineering, leveraging technologies such as Snowflake, Azure Data Factory, ADLS, and Databricks.
* Expertise in writing and optimizing SQL queries against any RDBMS.
* Experience structuring a data lake for reliability, security, and performance.
* Experience implementing ETL for data warehouse and business intelligence solutions.
* Ability to write effective, modular, dynamic, and robust code, and to establish coding standards.
* Strong analytical, problem-solving, and troubleshooting abilities.
* Good understanding of unit testing, software change management, and software release management.
* Knowledge of DevOps processes, including CI/CD and Infrastructure as Code fundamentals.
* Experience performing root cause analysis on data and processes and identifying opportunities for improvement.
* Familiarity with Agile software development methodologies.
* Proactive communication and stakeholder management.