With more than 90 per cent of all data ever created having been generated in the last few years, and analysts forecasting that by 2025 humans will generate 180 zettabytes of data annually, modern organisations face a serious challenge: reliably storing unprecedented volumes of data, and then extracting value from it to drive informed business decisions. Businesses are moving beyond the age of virtualisation towards a cloud era that requires scaling to the data demands of a vast range of business applications. But are they ready? We spoke with Alex McMullan, EMEA CTO of Pure Storage, about how businesses can prepare for the technologies of tomorrow.
1. What in your opinion are the key technology challenges modern enterprises face?
Across industries, the same question is being asked time and time again – ‘how do we get the most value from our data?’. The pace of technological advancement is growing fast, and consequently the volume and velocity of data is growing at an exponential rate. The problem is that the majority of businesses just aren’t keeping up.
As we shift from cloud computing architecture towards greater reliance on edge computing, the data demands involved are putting substantial pressure on networks. The datasets required to sustain the Internet of Things (IoT), for instance, are getting bigger but the challenge of collecting the data from thousands or millions of data sources and moving that data to the compute elements is proving tremendously difficult given existing network capabilities.
And this is only the beginning – over the next decade the business environment will be almost unrecognisable. Everything that can be connected will be, bringing greater efficiency to business processes, but the data involved in sustaining this will be enormous.
In the increasingly digital economy, the key is putting data to work in a way that enables and ensures speed, insight, and agility.
2. What is the scale of the problem – just how much data are we talking about?
At the moment businesses are struggling with terabytes and petabytes of data. As we move deeper into the cloud era, and the range of business applications grows, it’s only a matter of time before companies are trying to come to terms with exabytes and even zettabytes of information.
To put this into context, a zettabyte equates to roughly 250 billion DVDs of information – an incomprehensible amount of data. Yet this is the scale of the challenge businesses are facing.
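The DVD comparison can be sanity-checked with a little arithmetic. As a sketch (assuming a single-layer DVD capacity of about 4.7 GB; the article's figure implies a slightly smaller disc):

```python
# Sanity-check the zettabyte-to-DVD comparison.
# Assumption: a single-layer DVD holds about 4.7 GB (4.7e9 bytes).
ZETTABYTE = 10**21           # bytes in one zettabyte (decimal SI)
DVD_CAPACITY = 4.7 * 10**9   # bytes on a single-layer DVD

dvds_per_zettabyte = ZETTABYTE / DVD_CAPACITY
print(f"{dvds_per_zettabyte:.3e} DVDs per zettabyte")
```

This works out to roughly 213 billion DVDs per zettabyte, the same order of magnitude as the figure quoted above (250 billion corresponds to assuming about 4 GB per disc).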
While this clearly places pressure on the technical side of the C-suite in how they harness and manage this data, making it accessible and secure, in reality the implications of failing to derive value from all this data will impact everyone in the boardroom.
3. Can you give any examples of industries or sectors where the issues are particularly acute?
There’s scarcely an industry that isn’t touched by data, but some are clearly more impacted than others. Financial services – a sector in which I worked for many years – is an obvious one. Areas like trading, banking and insurance have always been largely data-oriented, but the pressure to deliver more business value at reduced cost only intensifies. As such, some businesses operating in these sectors have invested heavily in infrastructure to be prepared for the onslaught of data, and to use it as a strategic asset that serves the business with high performance. It’s those businesses that have been slow to react that are most at risk of being displaced or disrupted.
Other industries worth mentioning include web and SaaS-based businesses. The data demands of these areas are immense with trillions of data points being processed each day. Speed is of prime importance to these sectors as web and software application users simply won’t tolerate high latency. The higher the latency, the greater the risk of customer churn.
4. For years enterprises have been tackling big data, with varied success. As innovations in artificial intelligence and machine learning become a growing reality for businesses, what specific challenges does this pose?
There has been a clear shift in recent years in how these innovations are understood and applied. For a long time, artificial intelligence felt like a nebulous concept. Now we’re seeing examples of machine learning in many aspects of our daily lives, and we will only become more accustomed to such technologies as chatbots and AI assistants become mainstream consumer technologies.
In a business context, companies are pushing the boundaries of what’s possible through AI. The world’s largest technology companies, including Google, Facebook and Amazon, are collectively investing billions in artificial intelligence. Turning the pools of data available to us into the fuel for machine intelligence and automation is the next frontier of human development. We are just at the beginning of leveraging machine learning and advanced analytics to take data insights beyond even human prediction.
This is a two-sided coin. On one hand, the potential of AI to address macro challenges in critical areas such as healthcare and genomics is immense. On the other, it requires seriously strong infrastructure to underpin the innovation. Take the automotive industry, for example – making autonomous cars a mainstream part of our lives will be a data-heavy task. Cars will need to be fed vast amounts of data to operate efficiently and safely, and the industry will become a pool of data that needs to be managed.
5. What do you see as the potential for businesses that are able to get this right?
The prize is a competitive advantage, which in the current business climate is worth its weight in gold. Putting data to work brings high performance, efficiency, intelligence and the potential for substantial cost savings – all of which underpin the successful modern enterprise.
The ability to invest in a simple, scalable, high-performance data centre environment is one of the most powerful strategic assets any firm can possess. It frees the business to drive innovation, especially when it comes with a cost model that allows infrastructure spend to be inverted. Indeed, according to Gartner, it takes just 5.4 months on average to recover investments made in solid-state storage arrays, which helps reinforce the business’s longer-term prospects.
6. What does cloud-era flash storage look like?
In the evolution of technology, we’re moving from virtualisation to a cloud world. The application infrastructure has changed dramatically, bringing a range of new challenges and requirements. This applies equally to the storage industry, which has conventionally evolved to match the infrastructure needs of each generation.
The cloud era requires a solution that meets the new set of demands for storage, providing high performance at scale. Businesses need to act fast, so cloud-era flash storage must be able to handle data at speed, as high latency and data bottlenecks will have a negative impact across the business.
A cloud-era solution therefore focuses on exponentially expanding the value of data through faster and more reliable access. This enables businesses to build a new class of applications and to extract new insights from data. Doing this requires deepening integrations with the next generation applications and platforms, and tailoring solutions to the developers of tomorrow’s clouds and applications.
It is with this vision in mind that we’ve launched FlashArray//X – our enterprise class solution for the cloud era. The product is the first 100 per cent NVMe enterprise all-flash array, and is built on DirectFlash, the first software-defined flash module. This delivers on the needs of the enterprise of tomorrow, with remarkable results on latency, bandwidth and density.
7. Why is flash storage so crucial for modern enterprises?
If you look at how the storage industry has developed over the past decades, we were operating on spinning disks that got bigger but not faster. Flash disrupted spinning disks by offering a faster, more reliable solution.
Legacy forms of storage still pervade the market, yet are fundamentally unsuitable for modern data requirements. In most data centres today, data is the key bottleneck. It’s where compromises are ultimately made: how much to store, how much can be analysed, how long to wait for answers – compromises no one wants to make.
Flash has continued to develop and remove the barriers created by legacy storage, opening a new realm of possibility for technological advancement. And this flash revolution is far from over: as flash-based products become more readily available, the question for the majority of organisations is not whether but when to deploy an all-flash storage solution.
8. Where do you see the storage industry going over the next 10-15 years?
The industry has clearly undergone a lot of changes over the past decade. Storage providers have come and gone, and there’s undoubtedly been a lot of consolidation in the market. Those that have taken leadership positions in the industry, as with many industries, are those that have led the innovation.
The fact remains that people don’t want to talk about storage, they want to talk about data, applications and cloud. And this is where we need to take the conversation. I wish I had the ability to predict where the industry will be in 15 years, but in this industry that’s a near eternity.
Alex McMullan, EMEA CTO, Pure Storage
Image Credit: Everything Possible / Shutterstock