Job Scope: Establish, run, and manage data quality processes on core group Finance systems and platforms. Manage and mitigate data quality risk on an ongoing basis, and provide regular, clear updates to senior leaders on the level of data risk in the organisation.
Background and Purpose:
Through the Future Finance Programme (FFP) we are looking to build a sustainable growth platform, driving more consistent performance across each of our markets through modernisation of our organisational model and technology platform.
As part of this modernisation, there is a need to create a Finance centre of excellence (COE) to manage and run the Finance Data Repository (FDR) and its associated tooling, technology, and business processes. This position is essential in partnering with technology and business leads to design and build the associated Product Owner processes from scratch and transition them into a business-as-usual (BAU) run model.
The role purpose is as follows:
- Establish and implement an ongoing framework for managing, measuring, and governing data quality.
- Embed a mindset of continuous improvement in the timeliness, accuracy, completeness, and comprehensiveness of data flowing into Finance.
- Establish protocols, tooling requirements, and processes for incident management when faced with data quality issues or batch feed delays.
- Provide all downstream Finance data consumers with a clear product roadmap for data deliveries on FDR and, as required, on other systems and processes.
- Establish a people and resource capability to manage the data platform processes.
The key deliverables are:
- Work with Local Business Units (LBUs) to implement data quality rules on both the Unified Data Platform (UDP) and FDR (see the illustrative sketch after this list).
- Define and implement a centralised data quality issue tracker to monitor issues impacting Finance and work with data owners to resolve issues at source.
- Implement a continuous improvement model to detect, understand, and remediate data issues in a timely fashion, with adequate management information (MI) to inform senior leaders of the level of data quality risk currently faced by Prudential Finance.
- Implement consistency checks across data sets to ensure that data used across Finance is consistent.
- Define and implement a transparent, efficient operating model for managing, governing, and applying adjustments, overlays, and overrides of data.
- Implement a controlled mechanism and framework for sourcing and governing manual data inputs into FDR.
- Provide updates to the Data Governance Council on the product roadmap, data quality, and incidents.
- Oversee and review the testing strategy of FDR and associated applications.
- Partner with technology to ensure all technical feed controls are in place, and that there is a clear playbook and toolset to execute when faced with unanticipated incidents in upstream data feeds or FDR processing.
- Establish non-functional requirements and performance needs of FDR, and monitor those targets both in release management and on an ongoing basis in production.
- Establish service level agreements (SLAs) for onboarding or changing data on the FDR platform, and partner with technology to develop configurable tooling to onboard data quickly and within agreed SLAs.
- Establish a prioritisation process for onboarding new data into FDR to meet downstream needs.
- Establish a Data COE team to run and oversee all operational controls and processes. The processes and controls should be documented in the Ops Risk framework, have clear procedures, and meet all necessary operational considerations such as disaster recovery and location strategy.
- Work with the Finance Head of Data Governance to implement a security and entitlements model for securing and accessing Finance data.
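For illustration only, since this posting does not prescribe a toolset: the sketch below shows, in Python with pandas, what a simple data quality rule and a UDP-to-FDR consistency check of the kind described above might look like. Every name in it (fdr_policies, udp_policies, Rule, the premium checks, the tolerance) is a hypothetical stand-in, not an actual FDR or UDP artefact.

```python
from dataclasses import dataclass
from typing import Callable

import pandas as pd

# Hypothetical extracts; real checks would read from FDR/UDP tables.
fdr_policies = pd.DataFrame({
    "policy_id": ["P1", "P2", "P3"],
    "premium": [100.0, None, 250.0],
})
udp_policies = pd.DataFrame({
    "policy_id": ["P1", "P2", "P3"],
    "premium": [100.0, 180.0, 260.0],
})

@dataclass
class Rule:
    name: str
    passes: Callable[[pd.DataFrame], pd.Series]  # True where a row passes

rules = [
    Rule("premium_not_null", lambda df: df["premium"].notna()),             # completeness
    Rule("premium_non_negative", lambda df: df["premium"].fillna(0) >= 0),  # validity
]

def run_rules(df: pd.DataFrame, rules: list) -> list:
    """Evaluate each rule and emit one record per failing row,
    in a shape a centralised issue tracker could ingest."""
    issues = []
    for rule in rules:
        for pid in df.loc[~rule.passes(df), "policy_id"]:
            issues.append({"rule": rule.name, "policy_id": pid})
    return issues

def consistency_check(left, right, key, col, tolerance=0.0):
    """Flag keys whose values disagree between two data sets."""
    merged = left.merge(right, on=key, suffixes=("_fdr", "_udp"))
    diff = (merged[f"{col}_fdr"] - merged[f"{col}_udp"]).abs()
    return merged.loc[diff > tolerance, [key, f"{col}_fdr", f"{col}_udp"]]

print(run_rules(fdr_policies, rules))
# [{'rule': 'premium_not_null', 'policy_id': 'P2'}]
print(consistency_check(fdr_policies, udp_policies, "policy_id", "premium"))
#   policy_id  premium_fdr  premium_udp
# 2        P3        250.0        260.0
```

In practice such checks would run inside whatever rules engine is chosen for FDR (for example on Databricks), with failing records feeding the centralised data quality issue tracker described above.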
Principal Accountabilities:
- Establish a data quality rules engine, data consistency checks, and data quality issue tracking on FDR and within LBUs.
- Implement a process for continuous improvement of data quality.
- Develop the Finance data security and entitlements model.
- Establish a playbook, framework, and processes for manual data sourcing and for working around issues (i.e. adjustments, overrides, defaults, and overlays).
- Establish governance for prioritisation of data sourcing, publication of the product roadmap, tracking of MI on data issues, and raising awareness of data risks.
- Working with technology, ensure the test strategy for FDR is robust and agreed with data consumers.
- Establish SLAs and performance requirements with data consumers and monitor them in production. Agree a playbook and protocols for incident management.
Candidate Requirements:
- 10-15 years of industry experience in relevant data roles within a financial services organisation (preferably the insurance industry).
- 2-3 years' experience in either a Product Owner or Data Quality Lead role.
- Qualified or experienced Product Owner.
- Working knowledge of Collibra, Purview, or another data dictionary platform.
- DCAM certification, or extensive data governance experience at another financial institution.
- Proven track record in designing and implementing operating models.
- Extensive experience in data quality monitoring and data analysis.
- Extensive experience working with Audit, Compliance, and Operational Risk.
- Experience working with and managing data lakes, preferably on Databricks or Azure platforms.