
Storage – where we’ve been and where we’re going in 2017

(Image credit: Shutterstock/Scanrail1)

Over the last three decades we have seen several waves of innovation and product market shifts that have fundamentally changed the way businesses architect their infrastructure.

Back in the 1980s, storage was directly attached to a mainframe or internally attached to a PC or server. This made it hard to pool resources, led to poor utilisation, and made such solutions difficult to protect, manage and scale.

In the mid-1990s, the biggest phenomenon was the emergence of network storage, which was a massive driver for change: it allowed resources to be pooled and shared, enabling a degree of standardisation, performance pooling and data protection that was inherently difficult with the distributed direct-attached model.

Even so, the network storage market transitioned several times during this decade through several different architectural approaches: monolithic frame arrays versus modular storage arrays, storage area networks (SAN) versus network-attached storage (NAS), and Fibre Channel versus iSCSI.

Fundamentally, this era was defined by the use of advanced RAID techniques and spinning disk (of differing performance/cost profiles) to drive aggregation and standardisation, and to tackle age-old data protection issues (backup, restore, replication, resiliency and business continuity) using software features built into the storage platform.

And finally, the arrival of flash in the mid-to-late 2000s has probably been the single largest disruption to the storage market since the transition from direct-attached storage to network-attached storage twenty years earlier.

Flash adoption has really been driven by three key factors:   

- The performance/cost of flash compared with disk-only architectures (specifically when considering application response times)

- The areal density of flash

- The commoditisation of flash, which made it economically viable for mainstream computing (interestingly, this commoditisation was driven by the consumer market)

Over the previous 20 years the market players had been relatively static, dominated by large specialists like IBM, EMC and NetApp. But innovation in flash storage is now causing a shake-up of the industry, providing customers with simple, cost-effective ways to manage data. This has led to the emergence of a number of disruptive brands, like Nimble, that are challenging the legacy competitors of the previous decades.

We’re now seeing a changing of the guard. Over the past three years we’ve seen declining market share for established vendors, with customers turning to more modern vendors instead of the traditional legacy providers. And this trend is only accelerating. There is a similar trend in the motor vehicle market: when new technology becomes available, new brands come to the fore through innovation and by challenging the established legacy (Tesla, for example).

Storage demands are also changing rapidly. A growing number of big data applications require fast, efficient data delivery from ever-larger data sets, while users’ and consumers’ intolerance for downtime or delay means the supporting infrastructure has to deliver, so that productivity and business are not lost or taken elsewhere.

From fraud protection in the financial services industry, to recommendation engines in retail reducing costs and driving up-sell, to the ever-greater adoption of the Internet of Things capturing and responding to data collected from potentially thousands of sensors, the need for data velocity has never been greater. And with these growing and increasingly complex demands on data storage and management, it is crucial that businesses build and manage their infrastructure intelligently.

In 2017, we’re going to see new technologies and trends further changing how enterprises manage their data, and the storage and infrastructure capabilities that support it. Here are just a few that we’re expecting to see over the course of the next year.

Data mobility   

In 2017, I believe that data mobility will become an even more important issue for the enterprise than it already is. 

Firstly, data that is stored within the cloud is currently exceptionally expensive to move between differing cloud providers. Secondly, because global organisations’ data now resides across multiple clouds and different on-premises locations, ensuring that data is readily accessible by applications and end-users is becoming an increasingly serious challenge.   

Brexit looks likely to further complicate navigating what are already complex data sovereignty laws in Europe. The ability to move data quickly and efficiently between locations and providers - and across borders - will become even more pressing for global companies. And ensuring that all data can be located and accounted for will be crucial if companies are to avoid fines as GDPR looms.

Therefore, expect CIOs to place data security, sovereignty and mobility at the top of their ‘most important’ list in 2017. 


Containers

In 2016 there was massive hype around containers, but we’ve not yet seen this excitement translate into widespread enterprise adoption.

While the hype won’t change much from the previous year, in 2017 we can expect to see the turning of the tide in terms of adoption.  We’re going to see containers becoming increasingly mainstream in enterprise production environments.  Docker, Kubernetes and Apache Mesos are just three solutions that are beginning to gain traction, due to the important role that they play in ensuring containers are enterprise-ready. 

Their rise and the subsequent rapid adoption of containers will be a focal part of the IT market in 2017. 

Predictive analytics and artificial intelligence   

IT delays remain a fact of life for most businesses. Research commissioned by Nimble Storage last year indicated that a typical employee experiences an average of four software-caused delays per working day, each lasting about seven seconds.

These delays soon add up across an enterprise, and this ultimately impacts the wider economy: the cost of these delays amounts to a staggering £744,235,520 every year, when measured against the average hourly wage in the UK.
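To see how a few seven-second delays compound into a headline figure, here is a minimal sketch of the arithmetic. The delay frequency and duration come from the research cited above; the working days, hourly wage and workforce size are illustrative assumptions, not Nimble's published methodology.

```python
# Sketch: estimating the annual cost of small software delays.
# DELAYS_PER_DAY and SECONDS_PER_DELAY come from the cited research;
# everything else is an illustrative assumption.

DELAYS_PER_DAY = 4            # four software-caused delays per working day
SECONDS_PER_DELAY = 7         # each lasting about seven seconds
WORKING_DAYS_PER_YEAR = 230   # assumed working days per employee per year
HOURLY_WAGE_GBP = 13.0        # assumed average UK hourly wage
EMPLOYEES = 20_000_000        # assumed number of affected workers

# Hours lost per employee per year: delays * duration * days, in hours.
hours_lost_per_employee = (
    DELAYS_PER_DAY * SECONDS_PER_DELAY * WORKING_DAYS_PER_YEAR
) / 3600

# Total annual cost across the assumed workforce.
annual_cost_gbp = hours_lost_per_employee * HOURLY_WAGE_GBP * EMPLOYEES

print(f"Hours lost per employee per year: {hours_lost_per_employee:.2f}")
print(f"Estimated annual cost: £{annual_cost_gbp:,.0f}")
```

Under these assumptions each employee loses under two hours a year, yet the aggregate bill runs into the hundreds of millions of pounds - which is why per-employee delays that feel trivial still matter at national scale.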

That’s one of the reasons we’ve seen the boom of interest in flash storage. IDC recently reported that the total all-flash storage market generated a staggering $955.4 million in revenue during the final quarter of 2015, a 71.9 per cent increase on the previous year.

Storage is often not to blame for IT delays, however: in more than half of the cases examined by Nimble Storage, performance issues originated from complex infrastructure unrelated to storage, such as configuration and interoperability problems.

In 2017, we’re going to see a growing consciousness of this issue among businesses as they realise that investing in flash alone isn’t enough. Intelligence and self-management will be a key differentiator this year, separating the wheat from the chaff among storage companies.

Predictive analytics and artificial intelligence help companies to sharply reduce downtime and ensure that applications perform optimally.   

This is essential for moving IT teams away from “firefighter mode”, enabling them to devise and deliver a more proactive IT strategy.

Over the course of the next year, it’s going to become painfully clear which industry players are holding fast to the “speeds and feeds” model, because they will be the ones with dwindling market share. 

Computing everywhere 

Tech infrastructure is still being built at a staggering rate. Yet, all too frequently little attention is paid to storage until something goes belly-up.   

With the emergence of data-hungry applications and analytics, data has never been more valuable. And with the rise of flexible working, companies face new challenges in making sure data is available when applications request it, no matter the location. This makes data even harder to manage because, unlike the relative simplicity of networking and compute, data has gravity and takes time to move around.

When planning their data management, many companies still don’t consider how storage is a hidden enabler. But as we embark on this new paradigm of ‘computing anywhere’, storage will only underpin more aspects of our daily lives.

Rich Fenton, Systems Engineer Manager, UK&I, Nimble Storage  


Rich Fenton
Rich has been in the Storage industry for over 20 years, starting his career managing the storage and backup environment for Goldman Sachs before moving to the vendor community to be a Senior SAN consultant at StorageTek, NetApp and Nimble Storage. For the past 2 years, Rich has been leading the UK and Ireland Systems Engineering team at Nimble Storage.