Data Engineer - Inside IR35 - Glasgow - Private Sector - Hybrid
Day Rate - up to £675
Duration - 6 months
Harvey Nash's client is looking to bring in a contract Data Engineer. You will be responsible for designing, building, managing, and optimising data pipelines and the data model used by Data Scientists and Traders. You will work closely with Data Management teams on governance and security, with business stakeholders on projects, and with IT teams to deliver these data pipelines and models effectively into production. The role focuses on Python development on the Databricks and Streamlit platforms to deliver operational tools and process efficiencies.
Other Responsibilities
Architect, create, improve, and operationalise integrated and reusable data pipelines, tools, and processes, measuring the performance, efficiency, and robustness of data provision and applying best-practice process approaches.
Deliver commercial value by working with key business stakeholders, IT experts, and subject-matter experts to build high-quality analytics and data science solutions.
Ensure that solutions and developments fit the long-term Data Strategy for the analytics platform.
Explore the opportunities presented by the latest advances in technology and tools, and contribute to ensuring an appropriate operational model exists to support solutions in production.
Promote a better understanding of data and analytics across business stakeholders.
Stakeholder Management & Communication
Possess outstanding communication skills and encourage collaboration across EM. Building strong relationships throughout the business is crucial to preserving the trust and respect the team needs to continue offering support and guidance.
Strong communication and stakeholder management skills, with the ability to lead cross-functional projects and drive adoption of data quality practices.
Promote and champion the exceptional work being accomplished.
Skill/Experience Required
Extensive programming and query language skills, including Python/PySpark and SQL.
Data engineering experience in data preparation, transformation, and conversion, enabling rapid processing of system data across a range of formats and structures.
Experience with Databricks, Azure, ADF, Streamlit, or alternatives.
Excellent technical computing, analysis, design, and development skills at a proven professional level.
Good understanding of data streaming concepts, with design and analysis experience (not essential).
A good understanding of data modelling, ELT design patterns, data governance, and security best practices.
A problem-solving mindset, curiosity, and adaptability, with the ability to operate as a generalist across multiple data domains.
A strong background in delivering with cloud-based Microsoft Azure Data and Analytics capabilities, Microsoft DevOps, and Agile delivery.
Experience with designing, building, and operating analytics solutions using Azure cloud technologies