* StreamSets
* Python Programming
* Leadership and Team Management
* Strong Communication and Collaboration Skills
Responsibilities
* Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS.
* Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets.
* Collaborate with data analysts, scientists, and other stakeholders to define and fulfill data requirements.
* Optimize the performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability.
* Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake.
* Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity.
* Stay up-to-date with the latest trends and best practices in data engineering and cloud technologies, including AWS.
Qualifications
* Bachelor’s degree in Computer Science, Engineering, or a related field.
* 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS.
* Proficiency in SQL, Python, and ETL tools (StreamSets, dbt, etc.).
* Hands-on experience with Oracle RDBMS.
* Experience migrating data to Snowflake.
* Experience with AWS services such as S3, Lambda, Redshift, and Glue.
* Strong understanding of data warehousing concepts and data modeling.
* Excellent problem-solving and communication skills, with a focus on delivering high-quality solutions.
* Understanding of, or hands-on experience with, orchestration solutions such as Apache Airflow.
* Deep knowledge of key non-functional requirements such as availability, scalability, operability, and maintainability.