Build a stronghold to defend against data quality catastrophe

The cost of natural disaster recovery? Millions. The annual cost of poor data quality? Roughly $13.3 million (£8.4 million), according to industry research firm Gartner, Inc.

Analysts calculated this number based on a year-long study released in 2014. Despite the significant impact that data can have on business, the report found that only a quarter of the companies surveyed enforced data quality standards.

And only 18 per cent of organisations use formal metrics to measure data quality. Companies are courting financial disaster that could register a devastating 10 on the Richter scale.

The debate over quality versus cost in business transactions is hardly new. In light of the numbers above, however, it is surprising that the C-suite is still having this very debate where corporate data is concerned.

Keep in mind, this is the very same data that regulators demand meet strict compliance requirements, that the government checks against stringent legal standards and that organisations depend on to make smart financial decisions. It is vital today – affecting everything from marketing efforts to risk management – and it can make or break a business. Yet some are still leaving data quality up to chance.

And data’s growth shows no signs of slowing down. Market analysis firm IDC reports that the world’s data is doubling in size every two years. “Bad data” is on track to become a multi-million-dollar liability.

It is like purchasing a home in California and refusing to obtain earthquake insurance; it is only a matter of time before disaster strikes and you are left with staggering recovery costs.

Construct a fortress

The best way to defend against disaster is to lay a sturdy foundation. Data controls provide that stronghold, but there are multiple ways to implement and execute a controls platform that will banish “bad data.” When weighing their options, companies must balance cost, quality, time and resources.

The first option is to build custom controls from scratch. The primary benefit of a custom-built solution is obvious: it’s designed to address an organisation’s most pressing business needs. This is certainly an attractive advantage, but it presents major challenges as well.

Cost is the main issue. A custom data controls solution is expensive to develop, because it requires a wide range of expertise, and it becomes expensive to maintain once the developers who created the original application move on to other projects or jobs.

A lack of in-house expertise in developing controls is another significant issue. Deep knowledge of data controls is highly specialised, and IT leaders often build their skillsets in other areas. Without guidance from specialists in data controls solutions, critical pieces can fall through the cracks, wasting time or sinking the project entirely.
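To make the trade-off concrete, here is a minimal sketch of the kind of control a team might hand-roll: a record-count reconciliation between a source feed and its downstream load. The file names, CSV layout and tolerance are hypothetical, and a production control would add logging, alerting and scheduling on top.

```python
# Hypothetical hand-rolled data control: record-count reconciliation.
# File paths and the tolerance value below are illustrative only.
import csv

def count_records(path: str) -> int:
    """Count data rows in a CSV extract, excluding the header row."""
    with open(path, newline="") as f:
        return sum(1 for _ in csv.reader(f)) - 1

def reconcile(source_path: str, target_path: str, tolerance: int = 0) -> bool:
    """Pass if source and target record counts agree within the tolerance."""
    source = count_records(source_path)
    target = count_records(target_path)
    ok = abs(source - target) <= tolerance
    print(f"source={source} target={target} -> {'PASS' if ok else 'FAIL'}")
    return ok

if __name__ == "__main__":
    # e.g. last night's feed versus what actually landed in the warehouse
    reconcile("daily_feed.csv", "warehouse_load.csv")
```

Even this toy version hints at the maintenance burden: every new feed, format or tolerance rule means more bespoke code for someone to own.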


The second option is to purchase data control software to implement within your organisational infrastructure. In exchange for less direct oversight, purchasing pre-packaged controls software provides companies with the advantages of fast deployment and lower development costs. It also spares companies the need to invest time or money in contracting people who specialise in this technology.

Like anything else, pre-packaged solutions have drawbacks. Companies often choose this option in order to support one or more business processes. However, data quality is rarely a one-size-fits-all problem, and software will fail as a long-term solution if it cannot adapt to meet changing needs. This is a real risk with pre-packaged solutions, as many require substantial resources to customise or expand the controls beyond their limited scope.

The best of both worlds

Think of it like building a home on solid ground and purchasing a full suite of insurance coverage. In the same vein, organisations can buy a configurable controls solution that offers the benefits of a pre-packaged product with the flexibility of a custom-built one.

The best configurable controls solutions are built around industry best practices, with capabilities for expansion and continuous growth that maximise long-term value. By capitalising on vendor expertise, they lower development, maintenance, training and audit costs without bogging down internal IT teams in complicated data quality projects, ultimately avoiding homegrown mistakes without breaking the bank.
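What “configurable” means in practice is easier to see in a sketch. In the hypothetical Python example below, validation rules live in a vendor-supplied catalogue and are wired to a customer’s data purely through configuration, so the controls can be extended without new code. The rule names, record fields and limits are invented for illustration.

```python
# Hypothetical configuration-driven control: rules live in data, not code.
from typing import Callable

# Rule catalogue shipped with the product; vendor best practices live here.
RULES: dict[str, Callable[[dict, dict], bool]] = {
    "not_null":   lambda rec, cfg: rec.get(cfg["field"]) not in (None, ""),
    "in_range":   lambda rec, cfg: cfg["min"] <= rec[cfg["field"]] <= cfg["max"],
    "max_length": lambda rec, cfg: len(str(rec[cfg["field"]])) <= cfg["max"],
}

# Site-specific configuration: which checks to run, with no new code.
CONFIG = [
    {"rule": "not_null", "field": "account_id"},
    {"rule": "in_range", "field": "amount", "min": 0, "max": 1_000_000},
]

def validate(record: dict) -> list[str]:
    """Return the names of any configured rules the record fails."""
    return [c["rule"] for c in CONFIG if not RULES[c["rule"]](record, c)]

print(validate({"account_id": "A-42", "amount": 250.0}))  # -> []
print(validate({"account_id": "", "amount": -5.0}))       # -> ['not_null', 'in_range']
```

Adding a new check then means adding a line of configuration rather than commissioning a development project, which is precisely the flexibility a custom build struggles to retain once its original developers leave.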

Implementing these smarter data quality initiatives is a growing concern for organisations, with 86 per cent of respondents now ranking it a priority, according to the Gartner research.

From data governance to risk management, data controls are the backbone of business data today. And while just a quarter of companies currently enforce quality standards, data quality finally appears to be climbing the corporate agenda.

The stakes for data quality are now too high to risk being left behind; smart companies will treat comprehensive data controls as a competitive advantage moving forward.

One way or another, companies should assess what makes sense for them now, so they can implement data controls systematically across the organisation in a way that supports their long-term business goals.

Jeff Brown is the product manager at Infogix, a provider of data integrity and data analytics solutions to market leaders around the globe.