Key Responsibilities
- Designing, developing, testing, and deploying scalable, high-performance software solutions for data processing and analytics.
- Collaborating with business stakeholders to understand market and trading requirements and translate them into robust technical solutions.
- Providing technical leadership, code reviews, and mentorship to junior engineers.
- Troubleshooting, diagnosing, and resolving operational and performance issues in production systems.
- Engaging with other squads, product managers, data scientists, and technical stakeholders to deliver integrated solutions.
- Applying engineering best practices, including TDD/BDD, CI/CD automation, and high-quality documentation.
- Contributing to architectural decisions and evaluating technology options.
- Supporting continuous improvement of software engineering and data engineering standards within the team.

Qualifications
- Bachelor's degree or higher in computer science, engineering, or a related field.
- At least 7 years of professional experience in software development, ideally within financial services or energy trading.

Key Skills: Domain & Soft Skills
- Experience with front-office trading systems and financial market data.
- Experience working with large-scale data processing and analytics workloads.
- Strong track record of mentoring and coaching junior engineers.
- Ability to design innovative solutions while engaging closely with business stakeholders.
- Strong technical leadership and collaboration skills.
- Excellent communication and interpersonal skills; able to thrive in a diverse, fast-paced, and dynamic environment.

Key Skills: Technical

Must Have
- Python 3.9: design patterns, separation of concerns, OOP fundamentals, logical data modelling, pandas, SQLAlchemy/psycopg2, Poetry/setuptools, awareness of new PEP features.
- Python test automation: TDD, BDD; unit, integration, and end-to-end testing.
- DevOps: Git, CI/CD, Azure DevOps pipelines (or Jenkins/Groovy), Bash, Docker, Artifactory/PyPI.
- SQL: SQL Server/T-SQL; PostgreSQL/PL/pgSQL; query tuning, joins, aggregation, stored procedures, transactions.
- Cloud environments: AWS/Azure principles and technologies.

Should Have
- Kubernetes (EKS/AKS/OpenShift), including HA, kubectl/oc, operators, deployment configs, custom resources, Helm charts, StatefulSets.
- Python 3.9: Parquet/Arrow experience.

Nice to Have
- Airflow: schedulers, executors, operators, XCom, DAG execution at scale.
- Databricks: Hive/Unity Catalog, optimisation, PySpark, Databricks Connect.
- Log analytics: Splunk, Elastic Stack, Grafana/Loki/Prometheus.
- SSO: Kerberos, Azure AD / Entra ID.
- Recent Java experience: Java 17, Spring Boot 3.x, Maven.
- Dataiku (or similar): awareness of platform concepts; hands-on experience is a bonus.