Big Data, IoT and the need for high density and ultra high density computing

Big Data and IoT have long been heralded as the next revolution within the IT world. Beyond the headlines about connected devices and customer behaviour analysis, IoT and Big Data are being used to solve increasingly complex business problems. Digital businesses are turning to IoT technology to manage the connections, devices and applications that make up their organisation. Automated workflows - long a watchword of manufacturing business strategy - are being embraced by many disparate organisations.

IoT and big data are clearly intimately connected: billions of internet-connected 'things' will, by definition, generate massive amounts of data. The IoT industry relies on big data techniques to take all of the information it gathers and turn it into something useful, actionable - and sometimes automated. On the flip side, IoT provides a wealth of data which, with compute processing and intelligence, can generate invaluable insight for organisations to use.

And, although the future seems bright for these innovative technologies, for many, the possibilities are limited by issues of complexity and capacity. The benefits of IoT and big data will only come to fruition if businesses can run analytics that - with the growth of data - have become too complex and time-critical for normal enterprise servers to handle efficiently.

The big capacity challenge

IoT and big data put intense pressure on the security, servers, storage and networks of any organisation - and the impact of these demands is being felt across the entire technology supply chain. IT departments need more forward-looking capacity management if they are to proactively meet the business priorities associated with IoT connections, and big data processing requires vast amounts of storage and computing resources.

All this means that, ultimately, the data centre now sits firmly at the heart of the business. Beyond simply storing IoT-generated data, the ability to access and interpret that data as meaningful, actionable information - very quickly - is vitally important, and will give a huge competitive advantage to the organisations that do it well.

At VIRTUS, we believe that getting the data centre strategy right means that a company has an intelligent and scalable asset that enables choice and growth. But - get it wrong and it becomes a fundamental constraint for innovation. So organisations must ensure their data centre strategy is ready and able to deal with the next generation of computing and performance needs - to remain not only competitive and cost efficient, but also ready for exponential growth.

High performance computing 

Of course, the IT industry is devoted to designing innovative tools and techniques to keep up with the rapid evolution of tech trends like IoT and big data - and tech vendors already offer a multitude of solutions to the capacity and complexity problems.

High Performance Computing (HPC), once seen as the preserve of niche verticals such as education and pharmaceuticals, is now being looked at as a compelling way to address the challenges presented by IoT and big data. HPC has presented significant challenges of its own in recent years - such as scaling computing performance to cope with high velocity, high variety, high volume big data, and deep learning on massive-scale datasets - but the benefits are increasingly clear, and not just within a few key verticals. Data centre managers are now looking to adopt High Density innovation strategies in order to maximise productivity and efficiency, increasing both the available power density and the computing power per physical footprint of the data centre.

Indeed, High Density Computing (HDC) also addresses an important cost element - a crucial concern as complex tech developments mean that storage and power requirements spiral. HDC offers customers the ability to consolidate their IT infrastructure, reducing their data centre footprint and therefore their overall costs. The denser the deployment, the more financially efficient the customer's deployment becomes.
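To make the consolidation argument concrete, here is a minimal back-of-the-envelope sketch. All of the figures (total IT load, floor space per rack, annual cost per square metre) are illustrative assumptions for this example, not real pricing - the point is simply that, for a fixed IT load, higher power density per rack shrinks both the footprint and the space-related cost:

```python
import math

def racks_needed(total_it_load_kw: float, density_kw_per_rack: float) -> int:
    """Number of racks required to house a given IT load at a given power density."""
    return math.ceil(total_it_load_kw / density_kw_per_rack)

# Hypothetical figures, for illustration only.
TOTAL_LOAD_KW = 200          # total IT load to be housed
FOOTPRINT_M2_PER_RACK = 2.5  # floor space per rack, including aisle allowance
COST_PER_M2_PER_YEAR = 3000  # assumed annual colocation cost per square metre

for density in (5, 20):  # a standard rack vs a high-density rack (kW per rack)
    racks = racks_needed(TOTAL_LOAD_KW, density)
    area = racks * FOOTPRINT_M2_PER_RACK
    cost = area * COST_PER_M2_PER_YEAR
    print(f"{density} kW/rack: {racks} racks, {area:.0f} m2, {cost:,.0f}/year")
```

On these assumed numbers, moving from 5 kW to 20 kW per rack cuts the rack count from 40 to 10 and the space-related cost to a quarter - the "denser is cheaper" effect described above.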

Finding the right provider

We know that the processing requirements of IoT and big data, combined with the need to mitigate costs, are accelerating the need for HPC. But many organisations may find the public cloud ill-suited to delivering the right platform. We believe the answer is not to design and build a highly expensive owned data centre - one that will age rapidly and become inefficient - but instead to look to the colocation providers who understand the specialised needs of HPC.

Being able to support High Performance Computing in the data centre has become the new battleground for colocation providers - and high density capability will be crucial for businesses deciding which third party data centre to use. We think that organisations need to look closely at these capabilities. If High Density has been designed 'in' from the beginning, it provides the ability to support the next generation of businesses' IT infrastructure for High Performance Computing - optimising the data centre footprint required and the overall associated costs. This means that even if existing data centres take steps to offer High Density, they are playing catch-up with a next generation of intelligent data centres that already have this capability.

Providers that are working to upgrade legacy data centres for Ultra High Density face a more difficult task. Although the concept of High Density is straightforward, it involves a lot more than simply mainlining more electricity into the building. Before a data centre can support this requirement, it must have a robust and fit-for-purpose infrastructure in place. High Density not only requires increased quantities of power per cabinet, but also next generation cooling capabilities, which are extremely difficult to retrofit. Advanced cooling is essential, as higher energy consumption and harder working servers naturally equate to more heat.
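The link between power and cooling can be sketched with the standard sensible-heat relation: almost all electrical power drawn by a rack is dissipated as heat, so the cooling airflow must scale with rack power. This is a rough illustration using typical air properties and an assumed supply/return temperature difference - real data centre cooling design involves far more than this one equation:

```python
# Sensible-heat estimate: volumetric airflow = P / (rho * cp * delta_T),
# assuming near-all electrical input ends up as heat in the exhaust air.
AIR_DENSITY = 1.2         # kg/m^3, air at roughly 20 degrees C (assumed)
AIR_SPECIFIC_HEAT = 1005  # J/(kg*K), specific heat of air (assumed constant)

def airflow_m3_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow needed to carry away heat_load_w at a given delta-T."""
    return heat_load_w / (AIR_DENSITY * AIR_SPECIFIC_HEAT * delta_t_k)

for rack_kw in (5, 20):
    flow = airflow_m3_per_s(rack_kw * 1000, 12)  # assumed 12 K rise across the rack
    print(f"{rack_kw} kW rack: ~{flow:.2f} m3/s of cooling air")
```

Quadrupling rack power quadruples the required airflow (or forces a move to liquid or rear-door cooling), which is why cooling capability, not just electrical supply, is the hard part of a High Density retrofit.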

So, whilst we understand that making the right choice is not simply about the data centre - it is also about choosing the right High Performance Computing platform - it's important that organisations ask those tricky questions of providers, about infrastructure, cooling and energy consumption, before they sign on the dotted line.

Future proofing your business

This article has, rightly, focused on the technology behind these innovations. But ultimately the choice for businesses is a stark one which sits at a strategic level. Commercial industry has been radically changed by the application of digital technologies, and digital disruption means that companies can no longer be complacent. They can either seize the opportunity that IoT and big data offer - as game-changers like Netflix and Instagram have done - or see their business disappear.

While many industries have embraced this crucial opportunity to adopt IoT and big data technology, businesses that don't get the basics right will ultimately struggle to remain competitive on every front. The key component of success is to ensure that the data centre is equipped to handle the rigorous demands which these technology innovations place on it. Organisations must look to the right data centre partner to help their business succeed, and to new technologies like HPC and HDC to help meet these demands.

Darren Watkins, managing director, VIRTUS Data Centres
Image source: Shutterstock/wk1003mike