In our current landscape, data is the single most valuable asset that any organisation owns, regardless of size or sector. It has become a crucial part of doing business. By harnessing it effectively, companies can boost productivity and improve decision making, enabling them to stand out from the competition and offer real value to their customers.
But, in order to reap these rewards, companies must be able to understand the insights they are collecting. They must also be able to protect the data and ensure its quality so that any intelligence they glean from it can be used to inform wider strategies across the business.
This is why the role of Chief Data Officer (CDO) has seen such a meteoric rise in recent years, with the number of companies hiring one quadrupling since 2012. It is the CDO’s responsibility to both manage and protect all data, no matter where it is stored within an organisation. But it’s no easy feat.
Today’s enterprises hold a massive amount of data, which is likely to come in a variety of forms, each requiring different handling. Some pools of data might be gathered via the Internet of Things or social media, for example, which affects how structured or unstructured the information is. In addition, it might be stored in a data lake or a data warehouse, changing the collection and analysis process altogether.
Add to this the introduction of modern regulatory requirements for the way that organisations handle their digital information, such as GDPR – which threatens huge financial and reputational risk for non-compliance – and it’s easy to see why CDOs are under immense pressure to succeed, despite having the odds stacked against them.
A helping hand
Over the years, data virtualisation has emerged as the helpful and necessary companion of the CDO.
This type of technology connects users and applications to a single logical view of all information within an organisation, no matter where it resides. Rather than replicating data and moving it to a new, consolidated repository, data virtualisation stitches together information abstracted from various underlying sources. This means that each time a user runs a query that retrieves data from two or more disparate sources – whether from a BI, data science, or data visualisation tool – the results are joined into a single, easily digestible response.
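To make the idea concrete, here is a minimal sketch of query-time federation in Python. The sources and field names are hypothetical (an in-memory SQLite table standing in for a relational database, and a list of dicts standing in for a NoSQL or API-backed store); the point is that the join happens at query time, without copying either source into a consolidated repository.

```python
import sqlite3

# Hypothetical source 1: a relational database (in-memory SQLite here)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)",
               [(1, "Acme Ltd"), (2, "Globex")])

# Hypothetical source 2: a NoSQL/API-style store, modelled as a list of dicts
orders = [
    {"customer_id": 1, "total": 120.0},
    {"customer_id": 1, "total": 80.0},
    {"customer_id": 2, "total": 45.0},
]

def federated_query():
    """Join data from both sources at query time, leaving each
    source where it lies rather than replicating it."""
    customers = {row[0]: row[1]
                 for row in db.execute("SELECT id, name FROM customers")}
    # Stitch the two sources into one logical result set
    return [
        {"customer": customers[o["customer_id"]], "total": o["total"]}
        for o in orders
        if o["customer_id"] in customers
    ]

print(federated_query())
```

A real data virtualisation platform adds query optimisation, caching, and security on top of this pattern, but the core contract is the same: the consumer sees one logical view and never needs to know where each field physically lives.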
Data virtualisation also enables data to be conveniently accessed through front-end solutions, such as applications and dashboards, without the user having to know its exact storage location. It helps to accelerate the provision of data to consumers in formats that they can utilise effectively, whilst conforming to standard glossaries and policies. In a world where all business decisions need to be backed up by digital reasoning and evidence, data virtualisation is able to deliver curated insights to many different types of users for a wide variety of needs, providing the assurance that is needed.
Ultimately, data virtualisation enables a CDO – and therefore the wider organisation – to achieve better visibility and derive true value from their data whilst playing it ‘where it lies,’ accessing and joining it across multiple stores to create a logical data warehouse. This means that they can stop collecting and start connecting data, increasing both data quality and governance.
Ensuring quality and implementing governance
CDOs can utilise data virtualisation in several ways to ensure data quality whilst also driving data governance.
For example, organisations have long used both data lakes and data warehouses to store data. After all, both have their own separate set of benefits. But, if they are in different locations and use different technologies, it can be difficult for a CDO to combine, cleanse, and contextualise the data stored within them. Both of these actions, however, are essential to ensuring quality and implementing effective governance policies. On top of this, if consumers and applications have access to both areas, the CDO will have to enforce governance and security policies across both systems. This can be financially draining and make the process prone to inconsistencies.
Data virtualisation can combat this by providing a single access point to any data in the enterprise. This means that the CDO has a single entry point at which to apply consistent data quality, security, and governance policies across all data sources and consumers. In this way, data virtualisation allows a semantic layer to be created, whereby data is offered according to the formats and definitions specified in company-wide glossaries and policies, and data quality rules are checked and enforced.
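The semantic-layer idea above can be sketched in a few lines. Everything here is an illustrative assumption, not a description of any particular product: the canonical field names, the glossary mapping, and the two quality rules are invented for the example. The shape is what matters: one entry point normalises records to the glossary's definitions and checks quality rules before anything is delivered to consumers.

```python
# Hypothetical quality rules, defined once at the single access point
QUALITY_RULES = [
    ("email_present", lambda rec: bool(rec["email"])),
    ("revenue_non_negative", lambda rec: rec["revenue"] >= 0),
]

def to_canonical(rec):
    """Normalise source-specific fields to the (hypothetical)
    company-wide glossary's canonical names and formats."""
    return {
        "customer_name": rec.get("name") or rec.get("customer"),
        "email": (rec.get("email") or "").strip().lower(),
        "revenue": float(rec.get("revenue", 0)),
    }

def query_semantic_layer(raw_records):
    """Single entry point: normalise every record, then check the
    quality rules, flagging violations before data delivery."""
    delivered, flagged = [], []
    for rec in raw_records:
        canonical = to_canonical(rec)
        failures = [name for name, rule in QUALITY_RULES
                    if not rule(canonical)]
        (flagged if failures else delivered).append((canonical, failures))
    return delivered, flagged

records = [
    {"name": "Acme Ltd", "email": " SALES@ACME.COM ", "revenue": 10},
    {"name": "Globex", "email": "", "revenue": -5},
]
delivered, flagged = query_semantic_layer(records)
```

Because every consumer goes through `query_semantic_layer`, the rules live in one place instead of being duplicated across the lake and the warehouse, which is precisely the consistency benefit the single access point provides.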
This also grants the CDO heightened visibility into the quality of an organisation’s data, meaning that any potential data governance issues are flagged and dealt with before data delivery.
The future of the CDO
With statistics revealing that businesses with a CDO are twice as likely to have a clear digital strategy, it’s no wonder that the rise of the role has been so swift. And, seeing as the data explosion is not set to slow down any time soon, organisations are likely to continue increasing their investment, with the CDO set to play an even more crucial role in the boardroom moving forward.
However, in order for the CDO role to deliver on its promises, these individuals will need to transform data into true business value. Only then can data be used to inform decisions that will, ultimately, direct the future of an entire organisation. It’s likely to be a long road and not without its battles. But, by helping to ensure data quality and drive governance, data virtualisation is emerging as the strongest weapon in any CDO’s arsenal.
Alberto Pan, Chief Technical Officer, Denodo