Your responsibilities:
•Writes ETL (Extract / Transform / Load) processes, designs database systems, and develops tools for real-time and offline analytic processing.
•Troubleshoots software and processes for data consistency and integrity. Integrates large-scale data from a variety of sources for business partners to generate insight and make decisions.
•Translates business specifications into design specifications and code. Responsible for writing complex programs, ad hoc queries, and reports. Ensures that all code is well structured, includes sufficient documentation, and is easy to maintain and reuse.
•Partners with internal clients to gain an enhanced understanding of business functions and informational needs. Gains expertise in tools, technologies, and applications/databases in specific business areas and company-wide systems.
•Leads all phases of solution development. Explains technical considerations at related meetings, including those with internal clients and less experienced team members.
•Tests code thoroughly for accuracy of intended purpose. Reviews end product with the client to ensure adequate understanding. Provides data analysis guidance as required.
•Designs and conducts training sessions on tools and data sources used by the team and self-provisioners. Provides job aids to team members and business users.
•Tests and implements new software releases through regression testing. Identifies issues and engages with vendors to resolve them before promoting software into production.
•Participates in special projects and performs other duties as assigned.
Your Profile
Essential skills/knowledge/experience:
•Proficiency in designing, developing, and maintaining robust ETL pipelines for data ingestion and transformation.
•Cloud platform expertise: Solid exposure to AWS services for data storage, processing, and orchestration.
•Data modeling and architecture: Design scalable data models and ensure data integrity across systems.
•Programming proficiency: Advanced skills in Python for data processing and automation.
•Data quality and governance: Implement best practices for data validation, lineage, and compliance with regulatory standards.
•Minimum of five years of data analytics, programming, database administration, or data management experience. Undergraduate degree or equivalent combination of training and experience.
Desirable skills/knowledge/experience:
•Advanced knowledge of data engineering principles, including data warehousing and data lakes.
•Proficiency in AWS cloud services (e.g., S3, Redshift, Glue, EMR, Lambda) for data storage, processing, and orchestration.
•Exposure to machine learning pipelines and integration with data engineering workflows.