Overview
Data Engineer (AWS) – Hybrid, 1-3 days/week onsite in Skelmersdale, England. Employment Type: 6-month contract with an option for permanent placement.
About The Company
A leading UK-based online pharmacy, processing high volumes of prescriptions monthly, is dedicated to making healthcare accessible and efficient through innovative technology. The company leverages cutting-edge solutions to streamline processes and empower its team to deliver seamless healthcare services.
Role Overview
As a Data Engineer, you will manage and enhance a modern data platform, transforming raw data from pharmacy operations, e-commerce, and marketing into reliable, query-ready assets. You will build scalable pipelines using AWS services, ensure data quality, and collaborate with analysts to drive data-informed decisions.
Key Responsibilities
* Data Pipelines: Design and maintain modular, scalable pipelines using AWS (Lambda, Glue, Athena, S3) for data ingestion, transformation, and storage (an illustrative sketch follows this list).
* Python Development: Use Python with AWS CDK for infrastructure as code and AWS Lambda for processing; apply Pandas for efficient data workflows.
* Data Quality: Implement validation, anomaly detection, and GDPR-compliant governance.
* Performance Optimization: Tune pipelines for cost and performance; optimize data models (e.g., Parquet, partitioning).
* Automation: Manage infrastructure as code; automate CI/CD for deployments and schema changes.
* Collaboration: Work with stakeholders to create data products and guide BI best practices.
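To illustrate the kind of pipeline work described above, the following is a minimal sketch of an event-driven Lambda that ingests a raw CSV from S3, applies basic validation, and writes date-partitioned Parquet ready for Athena. The bucket name, prefix, and column names are hypothetical placeholders, not details of this role or company; a production pipeline would add fuller error handling, schema enforcement, and GDPR-aware treatment of personal data.

```python
# Illustrative sketch only: bucket names, prefixes, and the schema below
# are hypothetical and not taken from the job description.
import io
import os

import boto3
import pandas as pd  # assumes pandas + pyarrow are available in the Lambda package/layer

s3 = boto3.client("s3")
CURATED_BUCKET = os.environ.get("CURATED_BUCKET", "curated-data-bucket")  # hypothetical


def handler(event, context):
    """Triggered by an S3 put event: read a raw CSV, validate it,
    and write date-partitioned Parquet for Athena to query."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Ingest: pull the raw CSV object into a DataFrame.
    obj = s3.get_object(Bucket=bucket, Key=key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Basic data-quality checks: required columns present, no null order IDs.
    required = {"order_id", "order_date", "total_pence"}  # hypothetical schema
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Missing columns: {missing}")
    df = df.dropna(subset=["order_id"])

    # Transform: derive a Hive-style partition column from the order date.
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["dt"] = df["order_date"].dt.strftime("%Y-%m-%d")

    # Load: write one Parquet object per date partition.
    for dt, part in df.groupby("dt"):
        buf = io.BytesIO()
        part.drop(columns=["dt"]).to_parquet(buf, index=False)
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"orders/dt={dt}/{os.path.basename(key)}.parquet",
            Body=buf.getvalue(),
        )
    return {"rows": len(df)}
```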
Qualifications
* A degree in Computer Science or Data Engineering, or equivalent practical experience.
* 4+ years in data engineering with a focus on analytics and scalability.
* Expertise in AWS data services (Glue, Athena, S3, Lambda, Step Functions).
* Strong Python skills and experience with production-grade ETL pipelines.
* Knowledge of data modeling for analytics (e.g., partitioning, denormalization).
* AWS certification (e.g., Data Engineer) is a plus.
Must-Have Technical Skills
* AWS Data Services (Lambda, Glue, Athena, S3).
* Python (Pandas, AWS CDK).
* Production-grade ETL pipeline development.
* Data modeling (partitioning, denormalization, Parquet/ORC).
* Data quality and GDPR compliance.
Nice-to-Have Technical Skills
* AWS Certification (Data Engineer or similar).
* CI/CD automation for data workflows.
* Cost optimization for pipelines and queries.