Gartner listed blockchain as one of the top ten strategic technologies for 2020, so why are many enterprises still reluctant to adopt?
Distributed ledger technologies (DLT) have advanced beyond their initial implementation as a method to transfer digital value on the Bitcoin blockchain, and attempts have been made to introduce these technologies across enterprises worldwide as a way to shore up their operations in multiparty relationships.
Yet, the majority of applications of the technology haven’t made it to production.
This is because certain prerequisites need to be in place for the technology to be applicable and adoptable.
To understand why DLT adoption is lagging behind expectations, we need to understand the context of current multiparty data transactions. So how are things done at the moment and what are the specifics of data exchange using conventional technologies?
The centralized approach
In the traditional method, each counterparty maintains its own copy of the data, and this can be put down to four main reasons:
- Internal regulation
- Regulatory/legislative restrictions
- Inability to ensure predictable data quality
- Data privacy concerns
Overcoming these issues involves an excessive amount of labor to communicate and reconcile data between counterparties. Moreover, when mistakes are made, they are costly to resolve, with the costs even higher where actions are taken based on inconsistent data.
This brought about the need for a decentralized solution offering:
- Predictable, controllable data quality for use by all counterparties
- Actionable data to control the data lifecycle automatically
- Data privacy through adjustable, role-based permissioning
So why the hold up?
This is the promise of DLT. But the potential offered by the technology has thus far not been exploited for five reasons:
- Same siloed approach
- Compatibility issues
- No tools to manage data quality
- Scalability issues
- Lack of adequate development tools
Let’s break down what this means and explore some of the potential remedies.
Same siloed approach
DLT solutions to date have been isolated solutions for different business areas. Even when they are created using the same blockchain technology, this doesn't always allow for interoperation that can link up the different solutions within a single network. The network effects of increased participation and data sharing across networks are key to the successful implementation of distributed ledger technologies.
While it is encouraging to see different applications of DLT, there needs to be an environment in which different blockchains can share data and — ideally — interoperate with one another. Such a solution would necessitate a common public network and addressing system. Email is of no use if you can only exchange messages with your colleagues; it becomes a real tool for global message exchange when anyone can set up their own address and choose who they would like to communicate with. The same goes for DLT-based solutions.
Compatibility issues
As mentioned above, interoperability within DLT is important, but it can be argued that compatibility with existing technology is even more central to full-scale adoption. After all, if businesses cannot transfer their existing data from conventional systems to DLTs, then they are not going to adopt. Manual data transfer is out of the question, and very few will be willing to start from scratch.
Just as we are all dependent on the movement of goods across the world with minimal friction, we also need to be able to transfer digital assets from one platform (or system) to another, whether that is something based on a blockchain or on conventional tech. This means that native support for inbound and outbound calls is required in order to connect up the different systems.
No tools to manage data quality
Dirty data is a problem for all organizations, but it is exacerbated when we consider multiparty business transactions and ecosystems. Inconsistent, incomplete, and inaccurate data has been estimated to cost the US alone over $3 trillion. While DLT shows promise to help overcome these losses, current platforms don't offer solid data governance tools, without which data management becomes chaotic and data quality is reduced.
DLT platforms need to provide not only GDPR-like erasure capabilities, but also message and data encryption and signing based on role-based access control, to bring about greater accountability. Moreover, the ability to unify a DLT database by supporting multiple legally recognized cryptographic algorithms simultaneously in one network would avoid unnecessary data replication.
Scalability issues
Then there is the age-old problem of scalability. Innovation can bring about significant change in the way things are done and facilitate a transition to a completely new world. The problem is: how can these technologies be scaled up to cater for increased demand and make them much more useful? DLT platforms have faced scalability issues in terms of both data storage and throughput.
Whether DLT-platform scaling can be made possible via novel protocols and dynamic consensus mechanisms, or through network architecture that implements a separation of concerns for different computations, one thing is for sure: scalable DLT is a prerequisite for the technology’s adoption. The network effects of DLT mean it has to be scalable.
Lack of adequate development tools
Developing blockchain applications is made no easier by the non-industry-standard programming languages that must be mastered to build decentralized applications. This is a major barrier to adoption, as there simply aren't enough developers with the right skills to build DLT apps. Furthermore, the immutable nature of smart contracts doesn't pair well with an ever-changing business world.
Being able to develop apps on DLT platforms using common programming languages such as Go and Java would facilitate faster application creation. Other tools, such as templates for custom development and facilities for troubleshooting smart contracts and their accompanying business logic (e.g. middle layer, external services), along with support for development, debugging, and end-to-end testing, would boost deployment frequency.
Road to adoption
Simplicity is the key to adoption for many technologies and DLT is no exception here. Its complexity is one of the major blockers to adoption, while the fact that incumbent platforms are in some way derivatives of the architecture of the original 2008 Nakamoto paper means the technology remains nascent.
A new, business-oriented approach is needed in the creation of next-generation DLT platforms, with the technology not just looking to solve the centralization issues as initially promised, but offering a complete, scalable toolkit that caters to enterprise needs.
Andrey Zhulin, founder and CEO, Insolar