Job Title: Data Engineer
Location: Birmingham (Hybrid/Onsite as required)
Salary/Rate: £393.75 per day (Inside IR35)
Start Date: 31/07/2025
Job Type: Contract (until 31/10/2025)

Company Introduction
We have an exciting opportunity available with one of our leading consultancy partners working on a financial services project. They are currently seeking an experienced Data Engineer to join their growing DevOps and Data Platforms team on a contract basis.

Job Responsibilities/Objectives
As a Data Engineer, you will work within a cross-functional delivery team to build, manage, and support robust data pipelines and infrastructure. The role requires strong DevOps knowledge and a hands-on approach to problem-solving.
1. Design, implement, and maintain CI/CD pipelines and related automation tooling.
2. Develop and manage Python-based applications and APIs for data operations.
3. Build and support data engineering workflows, including ETL/ELT pipelines.
4. Operate within cloud-based and containerised environments with monitoring and logging tools.
5. Contribute to infrastructure management, including secrets management and networking configuration.
6. Provide operational support for large-scale big data platforms and services.

Required Skills/Experience
1. Strong DevOps engineering skills, including GitHub, Jenkins, GitHub Actions, Nexus, and SonarQube.
2. Proficiency in Linux environments with scripting knowledge (Groovy, Bash, Python).
3. Solid Python development experience, particularly with Flask, Dash, Pandas, and NumPy.
4. Expertise in data engineering tools and frameworks such as Spark, Airflow, SQL/NoSQL, and Delta Lake.
5. Experience with cloud infrastructure (GCP or internal cloud), Docker, Kubernetes, and Argo CD.
6. Familiarity with secrets management tools such as Vault, and with networking protocols (TCP/IP, SSH, TLS).

Desirable Skills/Experience
1. Experience with MLflow, Starburst, S3 buckets, and Postgres databases.
2. Working knowledge of data formats such as Parquet and Avro, and of optimisation practices such as partitioning.
3. Familiarity with service mesh technologies (e.g., Istio) and monitoring stacks (e.g., ELK).
4. Exposure to managing Hadoop or Spark clusters, and experience with JVM/JDK and Cloudera.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word or PDF format.

Disclaimer
Notwithstanding any guidelines given on the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies. Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.