Salary: £42,000 - £52,800 per year

Requirements:
Strong T-SQL for data validation and analysis.
Experience with PySpark or Python for data testing.
Background in testing data warehouses, data lakes or analytics platforms.
Solid understanding of ETL/ELT concepts and slowly changing dimensions (SCDs).
Experience validating data from ERP systems (e.g. finance, supply chain, manufacturing).
Exposure to Azure data tools (Synapse, Data Factory, Fabric, Databricks).
Familiarity with Microsoft Fabric Lakehouse (desirable).
Strong understanding of data quality dimensions.

Responsibilities:
Validate data ingested from ERP, customer service and operational platforms.
Design and run data quality, reconciliation and source-to-target tests (see the sketch at the end of this listing).
Test ELT pipelines built in Fabric Notebooks using PySpark/Python.
Validate incremental loads, full refreshes, error handling and pipeline performance.
Identify, document and track defects through to resolution.
Support regression testing and UAT with business users.
Translate business and technical specifications into clear test cases.
Contribute to data quality rules, standards and testing templates.
Provide documentation to support audit and compliance activities.

Technologies: Azure, Data Warehouse, Databricks, ETL, ERP, Fabric, Support, Python, PySpark, SQL, Cloud

More: We are seeking a Data Warehouse Test Analyst for a 6-month contract based in Brierley Hill, with a hybrid work model requiring 2-3 days onsite per month. In this role, you will play a key part in validating data from various ERP and operational systems, ensuring that business rules and transformations are correctly applied. You will work within a supportive team and contribute significantly to our data validation processes.

Last updated: week 7 of 2026
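
For illustration only, not part of the job specification: a minimal sketch of the kind of source-to-target reconciliation and data quality check described in the responsibilities, written in PySpark as named in the requirements. The SparkSession setup, table names (staging.sales_orders, lakehouse.fact_sales_orders) and column names (order_amount, order_id) are hypothetical placeholders, not details taken from this posting.

```python
# Hypothetical source-to-target reconciliation sketch; all object names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dwh-reconciliation-sketch").getOrCreate()

# Assumed source extract and warehouse/lakehouse target tables.
source_df = spark.table("staging.sales_orders")
target_df = spark.table("lakehouse.fact_sales_orders")

# 1. Row-count reconciliation: source and target should agree after the load.
source_count = source_df.count()
target_count = target_df.count()
assert source_count == target_count, (
    f"Row count mismatch: source={source_count}, target={target_count}"
)

# 2. Measure reconciliation: totals of a key numeric column should match.
#    Exact equality suits integer/decimal measures; float columns may need a tolerance.
source_total = source_df.agg(F.sum("order_amount").alias("total")).collect()[0]["total"]
target_total = target_df.agg(F.sum("order_amount").alias("total")).collect()[0]["total"]
assert source_total == target_total, (
    f"Amount mismatch: source={source_total}, target={target_total}"
)

# 3. Basic data quality rule: business keys in the target must not be null.
null_keys = target_df.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} target rows have a null order_id"

print("Reconciliation checks passed")
```

In practice such checks would typically run as test cases inside a Fabric Notebook or a test framework, with results logged against the defect-tracking and documentation activities listed above.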