Your responsibilities:
• Develop and maintain data pipelines using ADF for ingestion, transformation, and loading from diverse sources.
• Optimize ETL processes for performance and cost efficiency.
• Lead migration projects from Ab Initio to modern cloud platforms (Azure + Snowflake).
• Ensure data integrity, quality, and compliance during migration.
• Write and tune SQL queries and stored procedures, and implement best practices for query performance.
• Work closely with business stakeholders, data engineers, and analysts to deliver solutions.
• Implement data governance, security, and lineage practices using Azure tools.
• Explore and integrate emerging technologies such as Azure Databricks, Python, and orchestration frameworks for advanced analytics.
• Make key decisions on the data technology landscape, weighing risk against reward.
Your Profile
Essential skills/knowledge/experience:
• Cloud & Data Platforms: Azure Data Factory, Azure Synapse, Azure Data Lake, Snowflake.
• Database & Querying: SQL Server, PL/SQL, T-SQL.
• ETL Tools: Ab Initio (migration experience); Informatica (optional).
• Programming: Python, PySpark (preferred).
• Other: CI/CD with Azure DevOps, data modeling (star/snowflake schema), metadata management.
• Strong analytical, problem-solving, and communication skills.
Desirable skills/knowledge/experience:
• Microsoft Certified: Azure Data Engineer Associate (DP-203).
• Experience in the BFSI domain or with large-scale enterprise data platforms.