2017: the year when hybrid clouds enter the mainstream

Here are a few thoughts on how businesses are going to transform the way they use hybrid cloud in the coming year.

We've been talking about the cloud for some years now and, while we may look back at 2016 as a year of growing enterprise cloud adoption, 2017 is likely to be the year when hybrid clouds really enter the mainstream. Cloud is no longer “optional” for enterprises looking to remain competitive: with businesses demanding ever-greater agility from their IT functions and data volumes continuing to grow at an alarming rate, more and more IT departments are looking to move critical IT services to a combination of private and public cloud.

Research by Veritas in 2016 found that 38 per cent of workloads today exist in a private cloud, with 28 per cent in a public cloud. And these numbers are expected to grow at rates of 7 per cent and 18 per cent respectively over the next twelve months.   

With 2016 drawing to a close, here are a few of my thoughts on how businesses are going to transform the way they use hybrid cloud in the coming year:

Adoption of the cloud for business continuity and disaster recovery 

Until recently, there have been two significant inhibitors to the use of public/shared clouds for disaster recovery. The first was concern over data residency; the second was a lack of tools to effectively manage the migration of workloads from a primary data centre to the cloud. However, large cloud service providers like AWS, Azure and HP have now addressed the data residency issue in several jurisdictions by building local facilities, and several vendors, including Veritas, now provide toolsets to automate data replication and orchestrate workload migration. This is enabling businesses of all sizes, but particularly medium-sized ones, to consider the cloud as a viable alternative to maintaining their own dedicated DR facilities, or to extend their DR provision to a broader range of production workloads, reducing both risk and cost.
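
To make this concrete, here is a minimal Python sketch of the replication half of the problem, assuming an AWS account and the boto3 library; the bucket names and prefix are hypothetical, and real DR tooling layers scheduling, verification and failover orchestration on top of simple copies like this:

    import boto3

    # Hypothetical bucket names: a primary-region bucket and a DR-region bucket.
    PRIMARY_BUCKET = "acme-backups-eu-west-1"
    DR_BUCKET = "acme-backups-eu-central-1"

    s3 = boto3.client("s3")

    def replicate_to_dr(prefix="nightly/"):
        """Copy every backup object under `prefix` into the DR bucket."""
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=PRIMARY_BUCKET, Prefix=prefix):
            for obj in page.get("Contents", []):
                s3.copy_object(
                    Bucket=DR_BUCKET,
                    Key=obj["Key"],
                    CopySource={"Bucket": PRIMARY_BUCKET, "Key": obj["Key"]},
                )

    if __name__ == "__main__":
        replicate_to_dr()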

The beginning of the end for tape storage 

The introduction of de-duplicating disk storage devices has already seen tape drives marginalised in many enterprise data centres, with tape used only for long-term retention of backup data. In 2017, however, businesses will finally begin to eliminate tape even for long-term archiving, backing up to the cloud instead. This shift will be driven by improving cloud storage economics, increased confidence in data security and the maturing of dedupe-to-the-cloud technologies. Tape management still costs businesses millions every year in transportation and storage costs, as well as expensive mechanical device maintenance. In 2017, we will see businesses free themselves from the shackles of tape storage and enjoy cheaper, faster backup.
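
For readers unfamiliar with the principle, here is an illustrative Python sketch of block-level deduplication, the idea behind dedupe-to-the-cloud appliances: identical blocks are stored once, keyed by a content hash, so only unique data ever needs to be uploaded. The fixed block size is a simplification; real products typically use variable-size chunking:

    import hashlib

    BLOCK_SIZE = 4 * 1024 * 1024  # fixed 4 MiB blocks, for simplicity

    def dedupe_file(path, store):
        """Split a file into blocks, keep unique blocks in `store` keyed by
        SHA-256, and return the 'recipe' of hashes needed to rebuild it."""
        recipe = []
        with open(path, "rb") as f:
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).digest()
                store.setdefault(digest, block)  # only unseen blocks are stored/uploaded
                recipe.append(digest)
        return recipe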

Multiple initiatives to drive GDPR compliance 

May 2018 will see the introduction of the European Union’s General Data Protection Regulation (GDPR), and businesses need to use 2017 to prepare to comply with it. They will need to be able to answer four challenging questions:

  • Do you have visibility and insight into the personal data you store?
  • Can you locate all of the information you hold on a data subject?
  • Can you supply that data to a requestor within a tight deadline?
  • Can you prove what you’re doing with that data and is it protected?

Answering these questions will force businesses to embark on a long-overdue data management journey, which in turn will deliver additional benefits in the form of reduced data storage and management costs.
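
As a toy illustration of the second and third questions, the Python sketch below walks a directory tree looking for files that mention a data subject's identifier. The path and email address are hypothetical, and real GDPR tooling indexes data sources up front rather than searching on demand:

    from pathlib import Path

    def find_subject_data(root, identifier):
        """Return every readable file under `root` that mentions `identifier`."""
        hits = []
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            try:
                if identifier in path.read_text(errors="ignore"):
                    hits.append(path)
            except OSError:
                continue  # unreadable file: a real tool would log this
        return hits

    if __name__ == "__main__":
        for hit in find_subject_data("/srv/file-shares", "jane.doe@example.com"):
            print(hit)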

Now we’ve kicked the tyres of software-defined storage, time to implement

As mentioned above, workloads running on scale-out, private cloud infrastructures are expected to grow by a further seven per cent over the next twelve months. These new, hyper-scale environments demand a cloud-scale approach to data storage that is poorly served by today's monolithic, SAN-attached storage subsystems. Throughout 2016, enterprises have been exploring scale-out, software-defined approaches to storage management, and in 2017 they will put them into production. These solutions will enable enterprises to harness the power of commodity hardware – servers, SSDs and hard disks – to deliver cost-effective, policy-based, high-performance storage services, and will even enable them to merge on-premises and public cloud storage pools.
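
To illustrate the idea of a merged, policy-driven pool, here is a deliberately simplified Python sketch in which a software-defined layer routes each object to an on-premises or public cloud tier according to policy; the in-memory dictionaries stand in for real storage back-ends, and the size threshold is an arbitrary example:

    class MergedPool:
        """Toy software-defined layer over two tiers of a merged storage pool."""

        def __init__(self, hot_max_bytes=64 * 1024 * 1024):
            self.hot_max_bytes = hot_max_bytes  # policy: larger objects go to the cloud
            self.on_prem = {}  # stand-in for a local SSD/HDD pool
            self.cloud = {}    # stand-in for a public cloud bucket

        def put(self, key, data):
            """Place the object on the tier the policy selects; return the tier."""
            if len(data) <= self.hot_max_bytes:
                self.on_prem[key] = data
                return "on-prem"
            self.cloud[key] = data
            return "cloud"

        def get(self, key):
            """Read transparently, whichever tier holds the object."""
            if key in self.on_prem:
                return self.on_prem[key]
            return self.cloud[key]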

Big data platforms to become mainstream repositories of enterprise master data

In a move that surprised us this year, a number of customers asked how to protect their Hadoop-based big data repositories. It seems that, with businesses running more analyses and uploading ever more data, these platforms now deliver so much value to the organisation that they are simply too valuable to lose. Indeed, big data platforms are now serving as repositories of enterprise master data: if the data is lost or corrupted, it is not simply a matter of uploading it again, it is a business-critical issue.
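
As a minimal sketch of what such protection can look like, the snippet below drives Hadoop's standard DistCp job from Python to copy an HDFS directory to object storage; the paths and bucket are hypothetical, and enterprise backup products add cataloguing, incremental handling and restore orchestration on top:

    import subprocess

    def backup_hdfs_to_cloud(hdfs_path, target_url):
        """Run `hadoop distcp` to copy an HDFS directory to `target_url`."""
        subprocess.run(
            ["hadoop", "distcp", "-update", hdfs_path, target_url],
            check=True,  # raise if the underlying MapReduce copy job fails
        )

    if __name__ == "__main__":
        backup_hdfs_to_cloud(
            "hdfs://namenode:8020/data/warehouse",   # hypothetical source
            "s3a://acme-hadoop-backups/warehouse",   # hypothetical target
        )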

Veritas research conducted earlier this year confirmed that business-critical workloads in the cloud are set to double in the next two years, so the movement towards the cloud shows no sign of abating. Data volumes are still growing exponentially, and this isn’t going to slow down – it’s a problem that simply can’t be solved without the help of the cloud. With an ever-increasing choice of cloud providers, greater use of the cloud for disaster recovery and the acceleration of software-defined storage, it’s certain that when we look back at the end of 2017, the cloud landscape will look very different from today.

Peter Grimmond, Head of Technology EMEA, Veritas

ABOUT THE AUTHOR

Peter Grimmond is Head of Technology for EMEA at Veritas Technologies.