We are seeking a Senior Data Engineer for our Oil & Gas Operator client based in Aberdeen.
This is a STAFF role working as part of a Data Team, where you will focus on processes and identifying improvements.
As a candidate you will currently be working as a Data Engineer, ideally with 5 or more years' experience, and have a strong working knowledge of PySpark, Azure Databricks, Azure Data Factory and Azure Data Lake Storage.
Ideally you will be based in, or within commutable distance of, Aberdeen, as the role cannot be worked remotely.
Proven competency working with data warehousing, ETL/ELT, integration tools and business intelligence solutions that will help to deliver the data and analytics strategy.
Experience with Synapse (notebooks and data flows) is essential for the Senior Data Engineer.
Working knowledge of PySpark is essential for the Senior Data Engineer.
Experience working with Azure Data Lake Storage is essential for the Senior Data Engineer.
Working knowledge of the medallion data lakehouse architecture is essential for the Senior Data Engineer.
Strong background in data analytics, with a focus on data transformation and modelling.
Competency in Master Data Management principles and implementations is essential for the Senior Data Engineer.
Familiarity with Power BI is desirable but not essential.
Provision of support across all Development Projects within the Company's portfolio.
Supporting the Data Platform Program: Collaborate with cross-functional teams to maintain and enhance our modern data platform, leveraging your expertise in Synapse and data engineering techniques.
Understand data engineering best practice and its application, and stay up to date with emerging technologies in the data space.
Analyse, Model and Organise Data: Work with a range of stakeholders and business users to understand the use and utility of datasets and systems, then analyse, model and organise data from their respective source systems into the medallion data lake for further use in reporting.
Ensure Data Quality and Reliability: Drive improvements in data quality assessments, and ensure that data is processed effectively, efficiently, robustly and in a timely manner. Implement data validation and cleansing processes to improve data management.
Maintaining Data Governance: Ensure that data governance policies and procedures are followed, and that data lineage and cataloguing are maintained for data discoverability.
Bringing New Data Projects to Life: Take the lead in initiating, designing, and executing data projects, ensuring their entire lifecycle is managed effectively.
Performance Monitoring: Optimise and tune pipelines and data processing to improve performance and efficiency.
Performance Management: Review wider trends across the data processing infrastructure to identify improvements, and establish and implement monitoring and logging solutions to improve visibility and management of the data platform.
Project Scoping and Management: Define project scope and delivery timelines for a variety of projects, in collaboration with the Digital Technology Partners and the Data and Analytics Lead.
Data Governance: Work with the Data and Analytics Lead to improve data governance policies and procedures and their enforcement across the data estate.