
Conflicting forces – Finding a balance in the technology environment of 2016

As we enter 2016, we can look back once again on a year of rapid change. Data continues to grow exponentially. According to EMC, the digital universe will grow 300 times, from 130 exabytes in 2005 to 40,000 exabytes by 2020, while, according to Hortonworks, Big Data was worth approximately $27 billion in 2015 and is expected to become a $100 billion market by 2020. To some, though, it remains a case of the more things change, the more they stay the same. Issues around encryption, data security and data sovereignty will stay very high on the agenda over the next twelve months.
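The headline figures above imply some simple arithmetic worth making explicit: 40,000 exabytes against 130 exabytes is roughly a 300-fold increase, and growing a market from $27 billion to $100 billion over five years implies a compound annual growth rate of around 30 per cent. A quick sketch, using only the figures quoted from the reports above:

```python
# Growth factor implied by EMC's digital-universe figures (exabytes, 2005 -> 2020).
growth_factor = 40_000 / 130
print(f"Digital universe growth: ~{growth_factor:.0f}x")  # ~308x, quoted as "300 times"

# Compound annual growth rate implied by the Big Data market figures ($bn, 2015 -> 2020).
cagr = (100 / 27) ** (1 / 5) - 1
print(f"Implied market CAGR: {cagr:.1%}")  # roughly 30% per year
```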

A recent Breach Level Index report reveals that there were 888 data breaches in the first half of the year alone, compromising 246 million records of customers' personal and financial information worldwide.

Security challenges

The perception is that hacking is becoming more prevalent, and that alone is keeping data security and the need for encryption high on the agenda, both in the IT department and around the boardroom table of most businesses today. Basic two-factor authentication to protect password access is rapidly becoming a given, as employees continue to ignore best practice by using weak passwords or reusing the same ones across multiple accounts. The message for businesses is simple: start encrypting your data today.
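Two-factor authentication of the kind described above typically pairs a password with a time-based one-time password (TOTP), the six-digit codes produced by authenticator apps. As an illustration only (the function name and Base32 secret below are hypothetical, not taken from any particular product), a minimal RFC 6238 TOTP generator can be sketched in pure-standard-library Python:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestamp=None, digits=6, step=30):
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if timestamp is None else timestamp) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: use the low nibble as an offset.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example with the RFC 6238 test secret ("12345678901234567890" in Base32):
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", timestamp=59))  # prints "287082"
```

In practice, a business would rely on a vetted authentication library or service rather than hand-rolled code; the sketch simply shows that the mechanism itself is lightweight enough to be a baseline expectation.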

Related to this is the vexed question of data sovereignty. The recent ruling on Safe Harbor raises serious issues for any business that keeps data in the cloud. The fact that both Amazon Web Services and Microsoft are now building data centres in the UK shows that these key public cloud providers are taking the issue seriously, making significant investments with the aim of keeping sensitive corporate and personal data on these shores. Whether this approach will ultimately succeed, however, remains a moot point. A likely EU referendum this year may add further complexity and delay to resolving data sovereignty issues, especially if 'Brexit' becomes a reality.

With accessible volumes of data exploding, demand for flexible infrastructure and network solutions that help organisations turn data into operational insight and strategic advantage will become increasingly difficult to resist.

Driving agility

We expect the development of "cloud native" applications to grow strongly over the coming year, and with it the use of platforms such as Cloud Foundry or Eucalyptus. Coupled with DevOps working practices, this means developers and administrators increasingly manage live environments together. Providers with long-term aspirations need to adopt this technology now and move away from legacy platforms.

Linking in with the theme of technology enablement, another area of focus in the coming year is likely to be application containerisation. Packaging applications as containers and deploying them on an IaaS cloud will change the way some users consume IT services. And with vendors no longer able to charge for an operating system per workload, it could lower the cost of application ownership into the bargain.

Looking to the future, there are a raft of other business-enabling technologies whose development is expected to gather pace during 2016, including the Internet of Things. Indeed, a recent report from BI Intelligence projects there will be 34 billion devices connected to the Internet by 2020. IoT is rapidly moving from blue sky thinking to a genuine commercial powerhouse.

Quantum computing is another technology likely to develop in 2016. Towards the end of 2015, a team of Google and NASA engineers announced that the D-Wave 2X quantum computer, acquired by the two organisations back in 2013, had solved an optimisation problem in seconds, at a processing speed roughly 100 million times faster than that of a conventional computer chip.

Indeed, Google Director of Engineering Hartmut Neven recently claimed that what the D-Wave 2X can process in a second would take a single-core classical computer 10,000 years. Quantum CPUs and processing will ultimately offer huge potential, first in research and academic contexts and eventually in commercial ones too. When this technology reaches the mass market, it is likely to change every aspect of CPU-based devices while also transforming the wider business environment.

In 2016, we are witnessing a technology environment that is characterised by competing forces. On the one hand the need for prudence and caution remains paramount as businesses look to protect their data from the potential for attack and continue to debate issues around ownership and sovereignty. On the other, we see an ongoing push for solutions that drive business agility and operational efficiency, faster processing and ultimately the faster time to insight that delivers competitive advantage.

Simon Michie, CTO at Redcentric
