Build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.
Depending on their assessed skill sets, candidates in this role will either contribute to strategic deliverables by building data engineering pipelines on the AWS technology stack, or support platform modernization efforts by migrating existing AWS components to Databricks.
UI development experience is a must, along with AWS CloudFormation.
The candidate should be able to lead the project:
* Ability to design, implement and optimise complex data pipelines independently.
* Strong SQL and programming skills (Python/PySpark preferred); capable of building reusable and scalable code (see the illustrative sketch after this list).
* Experience with modern orchestration tools (e.g. Airflow, Step Functions, etc.).
* Hands-on experience with the AWS cloud platform, covering the skill sets listed below.
* Ability to proactively analyse existing processes, suggest improvements, and drive technical solutions end‑to‑end.
* Comfortable engaging with stakeholders, understanding business requirements, and translating them into technical design.
* Strong experience in Databricks development/migration (applies to the Databricks engineer track).
* Experience with Apache Spark, Databricks Delta Lake, Unity Catalog, and MLflow (applies to the Databricks engineer track).
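By way of illustration, the following is a minimal sketch of the kind of PySpark pipeline work described above, assuming a hypothetical S3 landing bucket, curated bucket, and orders schema; none of these names come from the role itself.

```python
# Illustrative sketch only: bucket names, paths, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read raw CSV files landed in S3 (hypothetical bucket/prefix).
raw = (
    spark.read.option("header", "true")
    .csv("s3://example-raw-bucket/orders/2024-01-01/")
)

# Basic cleansing: drop duplicates, enforce types, remove incomplete records.
clean = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

# Write to a curated zone as partitioned Parquet; on the Databricks track this
# would typically target a Delta table registered in Unity Catalog instead.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```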
Accountabilities
* Building and maintaining data pipelines that enable the transfer and processing of durable, complete, and consistent data.
* Design and implementation of data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
* Development of processing and analysis algorithms fit for the intended data complexity and volumes.
* Collaboration with data scientists to build and deploy machine learning models (see the MLflow sketch below).
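To give a flavour of the last accountability, here is a minimal, hypothetical sketch of logging a model run with MLflow so it can be handed over for deployment; the dataset, parameters, and model choice are illustrative assumptions only.

```python
# Hypothetical example: track a simple model with MLflow for later deployment.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    model = RandomForestRegressor(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)

    # Log parameters, metrics, and the fitted model as run artifacts.
    mse = mean_squared_error(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("mse", mse)
    mlflow.sklearn.log_model(model, "model")
```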
Hands‑on Skills
Data engineering coding background and hands-on experience with the following AWS services: S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, KMS, etc.
* CloudFormation
* Python
* Unit Testing (see the pytest example after this list)
* Gitlab
* PySpark
* AI/ML knowledge (Good to have)
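As an illustration of the Python, PySpark, and unit-testing expectations above, below is a minimal pytest sketch for a reusable PySpark transformation; the function, schema, and values are hypothetical.

```python
# test_transform.py - hypothetical example of unit-testing a PySpark transform.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


@pytest.fixture(scope="session")
def spark():
    # Local SparkSession so the test runs without a cluster or AWS access.
    return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()


def add_total_column(df):
    # Example reusable transformation: compute a line total per order row.
    return df.withColumn("total", F.col("quantity") * F.col("unit_price"))


def test_add_total_column(spark):
    input_df = spark.createDataFrame(
        [(2, 5.0), (3, 1.5)], ["quantity", "unit_price"]
    )
    result = add_total_column(input_df).collect()
    assert [row["total"] for row in result] == [10.0, 4.5]
```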
Seniority level
* Mid‑Senior level
Employment type
* Contract
Job function
* Accounting/Auditing
Industries
* IT Services and IT Consulting