Responsibilities
- Design, build, and maintain ETL/ELT pipelines and batch/streaming workflows.
- Integrate data from external APIs and internal systems into Snowflake and downstream tools.
- Use web scraping / browser automation to pull data from platforms that offer only UI-based data extraction (no APIs).
- Own critical parts of our Airflow-based orchestration layer and Kafka-based event streams.
- Ensure data quality, reliability, and observability across our pipelines and platforms.
- Build shared data tools and frameworks to support analytics and reporting use cases.
- Partner closely with analysts, product managers, and other engineers to support data-driven decisions.

Key Skills
- 3+ years of experience as a Data Engineer working on data infrastructure.
- Strong Python skills and hands-on experience with SQL.
- Experience with modern orchestration tools such as Airflow.
- Experience with APIs and extracting data from them.
- Understanding of data modelling, governance, and performance tuning in warehouse environments.
- Comfort operating in a cloud-native environment such as AWS.
- Terraform experience.

Nice to have
- Snowflake
- Web scraping via browser automation (for example, Playwright, Selenium, or Puppeteer)