Every so often in technology, a development sets off a chain reaction that starts slowly, gains unstoppable momentum, and changes everything we touch. The semiconductor transistor is one example: from its humble beginning as a single transistor on a chip of germanium 69 years ago, it has grown into the billions of transistors we now take for granted in the average microprocessor. Then again, every so often a perfect storm in the computing world precipitates a quantum shift in the way everyone approaches things. Change is indeed a constant, and one that can be hard to keep up with – it is not easy to foresee how it will affect customers, markets and businesses.
Financial services companies, including banks, are entering such a storm now, driven by big data, the cloud, security, high-speed WAN connections and the changing data sets we use. On top of the need to innovate with increasingly digital services, they must cope with a growing burden of legislation and industry regulation. Yet technology is also at the heart of the disruptive forces calling for change, and it is driving our consumption of ever larger volumes of data. In a business context, data must now be protected more than ever: it is the fuel that drives the financial performance and service delivery engine of most organisations.
More generally, Joao Lima wrote on 4th May 2016 in an article for Computer Business Review that 'disruption in the banking and insurance sectors is being led by consumers, with Brits at the forefront of transformation in Europe'. Fujitsu UK and Ireland’s ‘A New Pace of Change Report’ has also found that '37 per cent of consumers in Europe are willing to consider leaving their providers if they do not offer up-to-date technology, and 39 per cent of those surveyed in the UK said they would leave their provider if their digital demands were not met', reports Lima. In many cases, this demand also applies to the commercial sector because innovative technology and new ways of working can create an opportunity to find a competitive advantage, and to make an organisation more efficient and productive.
Just a few years ago the computing world looked very different. Data meant databases, Office files and slow, expensive wide area networks (WANs). No one suspected that certain agencies, governments and other hostile parties were regularly intercepting our data. Data was generated in the datacentre and consumed there too. Moving data between sites, or from remote offices, was all about squeezing as much as possible over the WAN by compressing the data down with deduplication techniques. With the file types in use at the time, this was a very successful technology.
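To make the idea concrete, here is a toy sketch (not any vendor's implementation) of fixed-block deduplication: each block of a file is hashed, unique blocks are transmitted once, and repeated blocks are replaced by short references. The block size and payload below are illustrative assumptions.

```python
import hashlib

def dedup_blocks(data: bytes, block_size: int = 4096):
    """Toy fixed-block deduplication: store each unique block once,
    and represent the file as a list of references into that store."""
    seen = {}          # block digest -> index in unique store
    unique, refs = [], []
    for offset in range(0, len(data), block_size):
        block = data[offset:offset + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in seen:
            seen[digest] = len(unique)
            unique.append(block)
        refs.append(seen[digest])
    return unique, refs

# Highly repetitive data, typical of older file types, dedupes very well:
payload = b"A" * 4096 * 10 + b"B" * 4096 * 5
unique, refs = dedup_blocks(payload)
print(f"{len(refs)} blocks reduced to {len(unique)} unique blocks")
```

On repetitive data like this, fifteen blocks collapse to two unique blocks plus a list of small references, which is why deduplication worked so well for the database and Office files of that era.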
Fast forward to the present, and the world has changed. We now hold so much data under management that we keep coining new superlatives to describe it – the latest being the 'data ocean'. The type of data we store has also changed dramatically. It no longer consists of just typical databases and Office files; it is now dominated by images, video files, and pre-compressed and encrypted files. There are also applications that already use deduplication to reduce the amount of storage they consume.
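This shift matters because pre-compressed and encrypted data looks statistically random, so a second pass of compression or deduplication gains almost nothing. A quick, self-contained demonstration using Python's standard zlib module (the payloads are illustrative assumptions – random bytes stand in for encrypted or already-compressed content):

```python
import os
import zlib

# Repetitive "Office-style" text, typical of older data sets:
text = b"quarterly report: revenue, costs, forecast. " * 1000
# Random bytes stand in for encrypted or pre-compressed files:
random_like = os.urandom(len(text))

for label, payload in (("repetitive text", text),
                       ("encrypted/pre-compressed-like", random_like)):
    ratio = len(zlib.compress(payload)) / len(payload)
    print(f"{label}: compressed to {ratio:.0%} of original size")
```

The repetitive text shrinks to a few per cent of its original size, while the random-looking payload barely shrinks at all – which is exactly why compression-based WAN optimisation loses its effectiveness on modern data types.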
The perfect storm
Now, if we add the cloud to the mix, we have a perfect storm that our current method of moving data over the WAN cannot cope with. Oddly, though, an article in Banking Technology magazine suggested in November 2015 that banks are still resisting the cloud. Elliot Holley wrote in his article, ‘No cloud please, we’re bankers’: 'Despite predictions over the last few years that banks were just a heartbeat away from adopting cloud technology, only 1% of banks are actually running core processing in the cloud today, according to a new report by Temenos [and Cap Gemini].'
He added: 'On one hand, the 2015 survey reveals that 89% of institutions are now running at least one application in the cloud, which compares against just 57% back in 2009 when the question was asked for the first time. However, the report notes a continued reluctance to run core banking applications in the cloud. Reasons cited included reticence about putting the bank’s most sensitive data in the cloud, with 34% citing concerns around data security.'
Yet Joe Curtis highlights on the Big Data Made Simple blog that banks have a huge big data opportunity, and much of the analysis of non-sensitive data is likely to occur in one type of cloud or another. He also gives a sense of how much data is being created:
'To get an idea of the sheer magnitude of data being created and collected these days, consider this estimate by industry analysts: 2.5 quintillion bytes of data, meaning a figure followed by 18 zeros, are created on a daily basis. A great portion of all this information is collected and analysed in accordance with the principles of Big Data, particularly with regard to stimulating innovation, improving efficiency and raising the levels of competitive enterprise.'
While increasing data volumes undoubtedly offer financial services organisations such as banks and insurance companies an opportunity to understand their markets and customers better, the transmission of data from its source to the appropriate business intelligence and data analytics solutions can be hindered by the very same storm. Network latency can strike like lightning, making the prospect of real-time analysis, data back-up and restore much harder to achieve.
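The effect of latency is easy to quantify: a single TCP stream can never move data faster than its window size divided by the round-trip time, no matter how fat the pipe. A minimal sketch, with illustrative window and round-trip figures (a 64 KiB default window; the specific RTT values are assumptions for the example):

```python
def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on a single TCP stream's throughput, in megabits
    per second: window size divided by round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

window = 64 * 1024  # a common default TCP window of 64 KiB
for rtt in (1, 20, 80):
    cap = max_throughput_mbps(window, rtt)
    print(f"RTT {rtt:3d} ms -> at most {cap:7.1f} Mbit/s per stream")
```

At an 80 ms round trip – roughly a transatlantic link – that default window caps a single stream at about 6.5 Mbit/s, even on a 10 Gbit/s connection. This is why simply buying more bandwidth does not solve the problem, and why latency mitigation matters for back-up and replication between distant datacentres.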
From a big data analysis perspective, it could make the difference between painting an accurate picture of what’s contained in the data, and a skewed one. This could make it more difficult to understand customers, to develop new strategies based on accurate data insight, and to develop new products and services to increase corporate profitability.
In a disaster recovery context, it’s important to have a solution in place that enables service continuity as well as business continuity. Banks and other financial services organisations shouldn’t rely, for example, on just one datacentre. They should have at least two, located outside each other’s potential circles of disruption. Yet many do not, because latency makes it harder to transmit ever increasing data volumes at speed between distant sites. To enable fast data back-up, retrieval and service continuity, they should therefore consider innovative solutions that mitigate the effects of network latency.
In effect, by embracing new solutions, new business practices and new processes, organisations of all types can address the challenges they face. They can also protect their businesses by investing in solutions that prevent human-made or natural disasters from damaging their customer relationships and reputations. With technology that speeds up data flows, they can analyse data in real time, gain the upper hand over their competitors, and create corporate strategies that address the needs of their customers and markets.
It’s also worth noting that legislation and regulation can inspire innovation, prompting new technologies as well as new business models. Banks and other financial services organisations should perhaps see the cloud, along with the legislation and regulation that surrounds it, more as an opportunity than as an obstructive and costly hurdle. This change of mindset could enable them to attract and retain customers.
David Trossell, CEO and CTO of Bridgeworks