Location: Bristol - 3 days on-site required
Job Type: Permanent
Salary: Competitive + bonus + benefits
About the Role
We are looking for a skilled and motivated Data Engineer to join a dynamic Data Engineering team. In this role, you will collaborate with fellow data engineers to design, develop, test, and maintain data products and our enterprise data analytics platform.
This is an exciting opportunity to join at a pivotal stage as we modernise our data platform and enhance our analytics and AI capabilities. You will play a key role in building the scalable data architecture and pipelines that power future innovation and business success.
If you are passionate about creating modern, scalable data solutions and want to help shape how data drives decision-making, we’d love to hear from you.
Key Responsibilities
* Build, test, and maintain scalable data architectures and pipelines aligned with business and architectural standards
* Collaborate with fellow data engineers to deliver high-quality solutions and share knowledge
* Promote continuous improvement and identify opportunities for automation
* Troubleshoot complex data issues and optimise systems for performance and reliability
* Implement data validation, monitoring, and quality controls to ensure data integrity and governance compliance
* Work with cross-functional teams to define requirements and deliver business-aligned solutions
* Communicate complex technical concepts clearly to non-technical stakeholders
* Research emerging technologies and deliver proof-of-concept innovations
* Participate in Agile ceremonies, code reviews, and technical governance processes
Skills & Experience Required
* Proven practical experience working as a Data Engineer
* Proficient in SQL, Python, and Spark (PySpark)
* Experience developing pipelines for structured and semi-structured data ingestion and transformation
* Understanding of data modelling and pipeline design patterns
* Experience with low/no-code pipeline tools (e.g., Talend or SnapLogic)
* Experience developing data pipelines using cloud services (AWS preferred) such as Lambda, S3, Redshift, Glue, Athena, and Secrets Manager, or equivalent services
* Experience integrating APIs and working with cloud SDKs/CLIs (e.g., boto3)
* Experience building data warehouses (Redshift, Snowflake, or Databricks)
* Git source control experience (Azure DevOps or similar)
* Agile/Scrum delivery experience
* Strong analytical and problem-solving skills
* Excellent communication skills with the ability to explain technical concepts to non-technical audiences
Nice to Have
* AWS certifications
* Experience using dbt for data modelling
* Infrastructure as Code experience (Terraform)