Must-Have Skills: Python (strong), Databricks, Apache Kafka, Airflow/dbt
Good-to-Have Skills: Kubernetes (strong)
Job Responsibilities:
* Technical Leadership: Own the design, development, and delivery of cyber data platform solutions, and provide technical guidance and mentorship across all cyber data platform teams.
* Data Architecture: Design, build, and manage real-time, near-real-time, and batch data architectures that support threat detection, incident response, and reporting through advanced analytics, machine learning, and GenAI capabilities.
* Coding: Raise the quality bar of the team's codebase by producing high-quality code, conducting thorough peer reviews, and proactively providing constructive feedback to other team members on their code.
* Data Security and Compliance: Implement and enforce data security best practices, ensuring the protection of sensitive information and compliance with relevant regulations.
* Automation and DevOps: Implement automation and DevOps practices to streamline the deployment, configuration, and management of data platform components.
Experience Required:
* Proven experience leading teams to deliver resilient systems and providing technical guidance toward optimal product solutions.
* Experience building data platforms on cloud services like Databricks on Azure or GCP BigQuery, with the goal of delivering a self-service data platform for users.
* Proficient in Python and SQL, with a solid grasp of architectural patterns, coding standards, code reviews, version control, and CI/CD practices.
* Expertise in ETL and ELT frameworks for large-scale real-time and batch data processing, with hands-on experience in Kafka, Flink, Airflow, and dbt, as well as containerisation technologies like Docker and Kubernetes.
* Ability to provide clear input, guidance, and growth opportunities that help engineers develop and advance. Knowledge of cybersecurity principles and practices.