Title: Data Analyst
Experience: 5 years
Work mode: Hybrid, 2 days WFO
Location: London
Contract duration: 6 months

The team/project
The RCMS project has been set up to support the front-line business units on the Salesforce platform. We are aiming to simplify the data and technology in order to have a more robust platform that is cheaper to support and develop. The data workstream is responsible for remediating one of our core data models, improving data accuracy, reducing data-related errors, and standardising data structures to enable faster, lower-risk delivery and to support the client's data-driven decision making and strategic objectives.

What you will be doing (the role)
• Joining a data remediation team tasked with simplifying the underlying data model for a heavily customised Salesforce implementation
• Attending workshops with technical and non-technical stakeholders to understand the current landscape, including requirements, drivers, and impacts
• Collaborating with Business Analysts and Architects to ensure business needs align with target data models, lineage, and documented metadata
• Reverse-engineering the current Salesforce object/field landscape, usage patterns, and data flows; identifying duplication, anti-patterns, and complexity
• Defining a target data model aligned to business processes, and producing a gap analysis from the as-is to the to-be state with consideration for business impact
• Establishing and informing data lineage, data dictionaries, taxonomies, and reference/master data

What you will get from the role
• We are looking for Data Analysts with experience of complex problem analysis and of requirements capture and refinement
• The opportunity to work with senior business stakeholders to understand, scope, and translate strategic business change into data concepts that can then shape the broader data landscape
• The chance to materially improve the reliability and usability of a business-critical Salesforce platform and its upstream/downstream consumers
• A key contributor role in the data remediation effort: you will make a substantial and valued contribution to the client in a role that offers great challenges both in the immediate future and as the client develops

The skills and experience you will have

Minimum Criteria
• Proficiency in data modelling techniques (3NF, star/snowflake, Data Vault 2.0), using modern tools (erwin, PowerDesigner, Sparx EA) and version control
• Experience creating and maintaining data dictionaries, taxonomies, and reference/master data, with governance for ongoing maintenance
• Experience working with internal customers to define data-relevant use cases and success criteria
• Strong communication and stakeholder engagement skills; able to communicate with technical and non-technical audiences

Desirable Criteria
• Exposure to relational and non-relational data stores; data warehouse/lakehouse concepts; ETL/ELT patterns and tools
• Experience in business analysis and requirements elicitation for data-centric change; translating business needs into data models
• Experience documenting/implementing data lineage (e.g., via Solidatus) and impact analysis
• Knowledge of data architecture best practices
• Proven ability to document current-state ("as-is") and define future-state ("to-be") processes and perform gap analysis
• Exposure to data governance frameworks (e.g., DAMA DMBOK) and practical rules (e.g., data quality rules)
• Familiarity with privacy/security-by-design concepts (handling PII, data minimisation, RBAC)