
Data quality – Why it matters and how financial services firms can best achieve it


Too many financial services organisations are still failing to implement effective data quality and risk management policies. Part of the problem is that they typically take a reactive approach: their overriding priority is to validate and cleanse the data that comes into the organisation before distributing it more widely, motivated by a desire to prevent downstream systems from receiving erroneous data. That is important, of course, but by focusing on ad-hoc incident resolution in this way, organisations almost inevitably struggle to identify and address recurring data quality problems in a structural manner.

To remedy this, firms need to carry out analysis on a more continuous basis, aimed at understanding their data quality and reporting on it over time. Very few organisations across the industry do this today, and that is a significant issue. However much data cleansing an organisation does, if it fails to track what was done in the past it will not know how often specific data items contained gaps or completeness and accuracy issues, nor where those issues occur most frequently.

Focusing data quality efforts exclusively on day-to-day data cleansing also makes it hard for organisations to understand how frequently data quality mistakes are made, or how often quick bulk validations replace more thorough analysis. For many, the focus on day-to-day cleansing disguises the fact that they lack a clear understanding of data quality, let alone a way to measure it or an overarching data quality policy.

When firefighting comes at the expense of properly understanding the underlying quality drivers, that is a serious issue. After all, in an industry where regulation on due process and fit-for-purpose data has grown increasingly prescriptive, the risks of failing to implement a data quality policy and data risk management processes can be far-reaching.

Implementing a framework

To tackle this effectively, organisations need to put in place a data quality framework. Indeed, the latest regulations and guidelines across the financial services sector, from Solvency II to FRTB, increasingly require firms to establish and implement one.

That means identifying what the critical data elements are, what the risks and likely errors or gaps in that data are, and what controls and flows are in place. Very few organisations have implemented such a framework so far. They may have previously put stringent IT controls in place, but these have typically focused more on processes than data quality itself.
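To make this concrete, the registry of critical data elements could be captured as simple structured data listing each element's source, its likely errors or gaps, and the controls that guard it. The sketch below is illustrative only; the element names, risks and controls are hypothetical and not specific to any particular framework or product.

```python
from dataclasses import dataclass, field

@dataclass
class CriticalDataElement:
    """A single critical data element with its known risks and controls."""
    name: str                                      # e.g. an instrument price or counterparty rating
    source: str                                    # upstream system or vendor feed
    risks: list = field(default_factory=list)      # likely errors or gaps
    controls: list = field(default_factory=list)   # validations applied in the data flow

# Hypothetical registry entries, for illustration only
registry = [
    CriticalDataElement(
        name="closing_price",
        source="vendor_feed_a",
        risks=["stale price", "missing value", "outlier vs. previous close"],
        controls=["staleness check", "completeness check", "tolerance check"],
    ),
    CriticalDataElement(
        name="counterparty_rating",
        source="internal_crm",
        risks=["unmapped counterparty", "out-of-date rating"],
        controls=["reference data lookup", "last-updated check"],
    ),
]

for element in registry:
    print(element.name, "->", ", ".join(element.controls))
```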

By using a data quality framework, firms can sketch out a policy that establishes a clear definition of data quality and what the objectives of the approach are. Such a plan could also document the data governance approach, including not just processes and procedures but also responsibilities and data ownership.

The framework could also help organisations establish the dimensions of data quality: that data should be accurate, complete, timely and appropriate, for example. For all these areas, key performance indicators (KPIs) need to be put in place to allow the organisation to measure what data quality means in each case. Key risk indicators (KRIs) also need to be implemented and monitored to ensure the organisation knows where its risks are and that it has effective controls to deal with them. KPIs and KRIs should be shared with all stakeholders for periodic evaluation.
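As a rough illustration of how such dimensions can be turned into measurable KPIs, the sketch below scores a small batch of records for completeness and timeliness. The field names, threshold and sample data are assumptions made purely for the example.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical batch of price records; field names and values are illustrative.
records = [
    {"instrument": "ABC", "price": 101.5, "as_of": datetime.now(timezone.utc) - timedelta(hours=1)},
    {"instrument": "DEF", "price": None,  "as_of": datetime.now(timezone.utc) - timedelta(hours=30)},
    {"instrument": "GHI", "price": 99.2,  "as_of": datetime.now(timezone.utc) - timedelta(hours=3)},
]

MAX_AGE = timedelta(hours=24)  # timeliness threshold (illustrative)

def completeness(batch):
    """Share of records with a populated price."""
    return sum(r["price"] is not None for r in batch) / len(batch)

def timeliness(batch):
    """Share of records updated within the allowed window."""
    now = datetime.now(timezone.utc)
    return sum(now - r["as_of"] <= MAX_AGE for r in batch) / len(batch)

kpis = {"completeness": completeness(records), "timeliness": timeliness(records)}
print({name: f"{value:.0%}" for name, value in kpis.items()})
```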

Scoping out the role of data quality intelligence

A data quality framework will inevitably be focused on the operational aspects of an organisation’s data quality efforts. To take data quality to the next level, businesses can employ a data quality intelligence approach, which gives them a much broader level of insight, analysis, reporting and alerting.

This will in turn allow the organisation to capture and store historical information about data quality, including how often an item was modified and how often data was erroneously flagged – both good indicators of the level of errors and of the quality of the validation rules. More broadly, it will enable critical analysis of these exceptions and of the effectiveness of key data controls, as well as reporting on data quality KPIs, vendor and internal data source performance, control effectiveness and SLAs.
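One way to picture that historical layer is as a simple log of validation outcomes that can be aggregated over time, showing how often each item was flagged and how often a flag proved to be a false positive. The event structure and item names below are assumptions for the sake of the sketch, not a description of any particular system.

```python
from collections import Counter
from datetime import date

# Hypothetical log of validation outcomes, one entry per data item per run.
events = [
    {"run": date(2019, 5, 1), "item": "closing_price:ABC", "flagged": True,  "false_positive": False},
    {"run": date(2019, 5, 1), "item": "closing_price:DEF", "flagged": True,  "false_positive": True},
    {"run": date(2019, 5, 2), "item": "closing_price:ABC", "flagged": True,  "false_positive": False},
    {"run": date(2019, 5, 2), "item": "closing_price:DEF", "flagged": False, "false_positive": False},
]

# How often each item was flagged, and how often a flag turned out to be spurious.
flag_counts = Counter(e["item"] for e in events if e["flagged"])
false_positives = Counter(e["item"] for e in events if e["false_positive"])

for item, flags in flag_counts.items():
    print(f"{item}: flagged {flags} times, {false_positives[item]} false positive(s)")
```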

In short, data quality intelligence forms a layer on top of the operational data quality functionality provided by the framework. It visualises what that framework has achieved, helping to ensure that data controls are effective and that the organisation is meeting its KPIs and KRIs. Rather than being an operational tool, it is a business intelligence solution, providing key insight into how the organisation is performing against its data quality goals and targets. CEOs and chief risk officers (CROs) can benefit from this functionality, as can compliance and operational risk departments.

While the data quality framework helps channel the operational aspects of an organisation’s data quality efforts, data quality intelligence provides key decision-makers and other stakeholders with an insight into that approach, helping them measure its success and demonstrate the organisation is compliant with its own data quality policies and with relevant industry regulations. 

Ultimately, the approach brings many benefits. It improves data quality in general, of course. Beyond that, it helps financial services organisations demonstrate the accuracy, completeness and timeliness of their data, which in turn helps them meet relevant regulatory requirements and assess compliance with their own data quality objectives. There is clearly no time like the present for firms across the sector to get their data quality processes up to speed.

Bayke Baboelal, Director, Data Services, Asset Control
Image Credit: Pitney Bowes Software