Methods Analytics (MA) is recruiting for a Data Engineer to join our team on a permanent basis.
This role will be mainly remote but will require flexibility to travel to Bristol.
What You'll Be Doing as a Data Engineer:
* Design and architect modern data solutions that align with business objectives and technical requirements
* Design and implement advanced ETL/ELT pipelines using Python, SQL, and Apache Airflow
* Build highly scalable and performant data solutions leveraging cloud platforms and technologies
* Develop complex data models to handle enterprise-level analytical needs
* Make critical technical decisions on tools, frameworks, and approaches for complex data challenges
* Optimise large-scale data processing systems for performance and cost-efficiency
* Implement robust data quality frameworks and monitoring solutions
* Evaluate new technologies to enhance our data engineering capabilities
* Collaborate with stakeholders to translate business requirements into technical specifications
* Present technical solutions to leadership and non-technical stakeholders
* Contribute to the development of the Methods Analytics Engineering Practice by participating in our internal community of practice
Requirements
* Experience in SQL Server Integration Services (SSIS)
* Good experience with ETL tooling: SSIS, SSRS, and T-SQL (on-premises and cloud)
* Strong proficiency in SQL and Python for handling complex data problems
* Hands-on experience with Apache Spark (PySpark or Spark SQL)
* Experience with the Azure data stack
* Knowledge of workflow orchestration tools like Apache Airflow
* Experience with containerisation technologies like Docker
* Proficiency in dimensional modelling techniques
* Experience with CI/CD pipelines for data solutions
* Experience implementing and advocating for test-driven development methodologies in data pipeline workflows, including unit testing, integration testing, and data quality validation frameworks
* Strong communication skills for translating complex technical concepts
* Track record of successful project delivery in a technical leadership capacity
You may also have some of the following desirable skills and experience:
* Experience designing and implementing data mesh or data fabric architectures
* Knowledge of cost optimisation strategies for cloud data platforms
* Experience with data quality frameworks and implementation
* Understanding of data lineage and metadata management
* Experience with technical project management
* Experience with data visualisation tools like Power BI or Apache Superset
* Experience with other cloud data platforms like AWS, GCP, or Oracle Cloud
* Experience with modern unified data platforms like Databricks or Microsoft Fabric
* Experience with Kubernetes for container orchestration
* Understanding of streaming technologies (Apache Kafka, event-based architectures)
* Software engineering background with SOLID principles understanding
* Experience with high-performance, large-scale data systems
* Knowledge of recent innovations in AI/ML and GenAI
* Defence or Public Sector experience
* Consultant experience
Security Clearance:
UKSV (United Kingdom Security Vetting) clearance is required for this role, with Security Check (SC) as the minimum standard.
Our Hiring Process
At Methods Analytics, we believe in a transparent hiring process. Here's what you can expect:
1. Internal Application Review
2. Initial Phone Screen
3. Technical Interview
4. Pair Programming Exercise
5. Final Interview
6. Offer