Job Description
As part of the Licensing team, the Senior Data Engineer plays a pivotal role in shaping our data infrastructure and deriving actionable insights from Ansys application usage telemetry data using Databricks. Your expertise will be critical in designing, implementing, and optimizing data pipelines that transform raw telemetry into insights that guide decisions about where to improve and streamline our applications.
RESPONSIBILITIES
* Design, develop, and implement scalable, high-performance data solutions on the Azure platform.
* Optimize data storage and retrieval mechanisms to ensure efficient processing of large datasets.
* Develop, optimize, and manage ETL/ELT data pipelines using Databricks, with a focus on data integrity and quality.
* Optimize our data warehouse, identifying opportunities to reduce complexity and cost.
* Investigate problems discovered by QA or product support and develop solutions.
* Work under the general supervision of a development manager.
* Participate in planning, architecture, and research.
* Collaborate with cross-functional teams, including data scientists, engineers, and analysts, to translate business requirements into scalable solutions.
MINIMUM QUALIFICATIONS
* BS in Engineering, Computer Science, or a related field with 5 years' experience; MS with 3 years' experience; or PhD with 1 year of experience.
* Strong background in software development in a commercial environment, with expertise in API development and integration.
* Proficiency in Python and PySpark.
* Prior experience implementing ETL/ELT solutions using Databricks.
* Deep understanding of data structures, data schemas, and data-handling algorithms.
* Effective verbal and written communication skills.
* Ability to learn quickly and to collaborate with others in a geographically distributed team.
PREFERRED QUALIFICATIONS
* Strong experience in data engineering and architecture, with a focus on Azure cloud technologies.
* Experience with Databricks Platform administration, cluster sizing and optimization, and CI/CD lifecycles.
* Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT).
* Experience with Power BI/Tableau/QuickSight.
* Good communication and interpersonal skills.