Role Overview:
One of the world's largest banks is looking for an Enterprise Data Engineer for its Global Cybersecurity Data Acquisition and Engineering team. The successful candidate will be responsible for all aspects of data acquisition and onboarding to the cyber data lake and analytics platform. The team partners with IT Infrastructure teams on the end-to-end deployment of security technologies across the firm and delivers normalised security data feeds to the Databricks cyber analytics platform.
The role involves designing and implementing data ingestion patterns and pipelines to make feeds available for security analytics. Solutions must be sustainable, reliable and performant, and implementable on a repeatable basis.
The role requires an experienced systems engineer with strong technical leadership and collaboration skills. The ideal candidate will have significant experience in data pipeline and analytics technologies, Linux system administration and cloud infrastructure (e.g. Databricks, Data Factory, NiFi, Kafka, RHEL/Ubuntu, Azure Function Apps, Azure platform services, etc.).
Experience:
* Experience with cloud-based data pipelines (preferably Azure).
* Expertise in one or more key areas of Google Site Reliability Engineering (SRE) principles for large-scale cloud infrastructure, with a commitment to continuous learning and professional development.
* Broad knowledge of Azure services (Identity, Networking, Compute, Storage, Web, Containers, Databases, Power BI).
* Proficiency with Infrastructure-as-Code and automation tools (Terraform, Chef, Ansible, CloudFormation/ARM).
* Familiarity with streaming platforms (Azure Event Hubs, Kafka) and stream processing (Spark Streaming).