I am hiring for an AWS Data Engineer.
Location: Glasgow (onsite 2–3 days per week)
Job Description
We are looking for an experienced AWS Data Engineer with strong hands‑on coding skills and expertise in designing scalable cloud‑based data solutions. The ideal candidate will be proficient in Python, PySpark, and core AWS services, with a strong background in building robust data pipelines and cloud‑native architectures.
Key Responsibilities
* Design, develop, and maintain scalable data pipelines and ETL workflows using AWS services.
* Implement data processing solutions using PySpark and AWS Glue.
* Build and manage infrastructure as code using CloudFormation.
* Develop serverless applications using Lambda, Step Functions, and S3.
* Perform data querying and analysis using Athena.
* Support Data Scientists in model operationalization using SageMaker.
* Ensure secure data handling using IAM, KMS, and VPC configurations.
* Containerize applications and deploy them using ECS.
* Write clean, testable Python code with strong unit testing practices.
* Use GitLab for version control and CI/CD.
Key Skills
Python, PySpark, S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, KMS, CloudFormation, GitLab
Seniority level
Mid‑Senior level
Employment type
Contract
Job function
Information Technology and Other
Industries
IT Services and IT Consulting, Banking, and Financial Services