Data disaggregation: What, where, and why?

Today there are estimated to be over 20 billion smart connected devices in the world. By 2020, experts anticipate this number will rise to over 50 billion – all built to generate, collect, analyse, and share data.

A recent report by energy analyst Peter Kelly-Detwiler noted that while only 22 per cent of this information is currently a candidate for analysis – i.e., useful if tagged – by 2020, the useful percentage could grow to more than 35 per cent. It is this data, appropriately analysed and acted upon, that has enormous potential economic value to society.

So, how can businesses separate the useful from the useless, or set parameters between the business critical and the nice to have? And, once distinguished, how should these data sets be handled differently?

All data is not equal

While all data is expressed in 1s and 0s, the differences arise in what those numbers represent and how businesses want to use them.

Some data is meant mainly for storage, some data needs immediate access, and some data might just hold the cure for cancer or the prevention of the next major financial crisis, and for that reason should be analysed against whole sets of other data to understand all the pertinent variables.

However, in today’s IT-directed enterprise, all data is treated equally in terms of how and where it is stored. Yet in an environment where budgets are shrinking, IT decision-makers should be asking whether it really makes sense to pay a premium for data centre space, power, and connectivity when it isn’t necessary.

Do the economics add up when spending roughly 80 per cent less on power could free up IT budget to be spent more effectively on other parts of the organisation? Would housing compute resources in a location that offers fixed, long-term pricing be a better way to keep operational costs in check than a location – like London – that recently saw power pricing jump to over £1,000 per megawatt hour?
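To make that trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. The load size and electricity prices are illustrative assumptions, not figures from this article or any market; only the "roughly 80 per cent less for power" ratio comes from the text above.

```python
# Illustrative comparison of annual power cost for a constant IT load at two
# hypothetical electricity prices. All figures below are assumptions made for
# the sake of the example.

HOURS_PER_YEAR = 8760

def annual_power_cost(load_mw: float, price_per_mwh: float) -> float:
    """Annual energy cost in pounds for a constant load at a flat price."""
    return load_mw * HOURS_PER_YEAR * price_per_mwh

load_mw = 1.0                      # assumed constant 1 MW IT load
metro_price = 150.0                # assumed metro price, GBP per MWh
remote_price = metro_price * 0.2   # roughly 80 per cent cheaper, per the article

metro_cost = annual_power_cost(load_mw, metro_price)
remote_cost = annual_power_cost(load_mw, remote_price)

print(f"Metro location:  £{metro_cost:,.0f} per year")
print(f"Remote location: £{remote_cost:,.0f} per year")
print(f"Saving:          £{metro_cost - remote_cost:,.0f} per year")
```

Even at these rough, assumed prices, the gap compounds over a multi-year hosting contract, which is the point of the question above.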

If this seems reasonable, IT leaders would be wise to look at their respective data programmes and identify the applications that require local hosting (based on application and usage requirements) and those that can run at a location with abundant, low-cost, stable power.
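As a rough illustration of that triage, the sketch below sorts a hypothetical application portfolio into "keep local" and "candidate for a remote, low-cost-power site" buckets based on an assumed latency threshold. The application names, threshold, and figures are invented for the example and are not drawn from the article.

```python
# A minimal sketch of triaging applications by latency sensitivity.
# Names, thresholds, and data volumes are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Application:
    name: str
    max_latency_ms: float   # tightest round-trip latency the app can tolerate
    data_volume_tb: float   # working data set size

LOCAL_LATENCY_THRESHOLD_MS = 10.0   # assumed cut-off for "must stay local"

def placement(app: Application) -> str:
    """Very rough placement rule: latency-critical apps stay local,
    everything else is a candidate for a remote, low-cost-power site."""
    if app.max_latency_ms < LOCAL_LATENCY_THRESHOLD_MS:
        return "local data centre"
    return "remote low-cost-power site"

portfolio = [
    Application("trading-gateway", max_latency_ms=1.0, data_volume_tb=2),
    Application("genome-pipeline", max_latency_ms=500.0, data_volume_tb=400),
    Application("archive-storage", max_latency_ms=2000.0, data_volume_tb=900),
]

for app in portfolio:
    print(f"{app.name}: {placement(app)}")
```

In practice the decision would weigh more than latency – data gravity, regulation, and network costs among them – but even this simple rule separates the workloads discussed next.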

Data disaggregation

Disaggregating data in this way is an approach particularly suited to High Performance Computing (HPC), Big Data analytics, and storage workflows.

These applications are currently at the heart of many IT programmes across dozens of industries, focused on solving problems and improving understanding in areas such as climate change and weather mapping, genome sequencing, automotive safety, and risk analysis.

Looking again to Kelly-Detwiler’s report:

'Relocation would result in lower operating costs and less exposure to the risk of power supply disruption. It would also help to take some stress off the UK power grid, providing planners with more time to develop cost-effective options. While some data centres absolutely need to be located close to customers and support applications, this holds true mainly for financial transactions that require minimal latency. Research undertaken by consulting firm The Broad Group indicates that even in the financial sector, only 10-15 per cent of applications actually require proximity to the London Stock Exchange.

'Today, more and more individuals are increasingly comfortable with the cloud and the knowledge that their data can literally be stored anywhere and shipped in milliseconds (for example, Verizon indicates it can move data from the US to Europe and back in 90 milliseconds). As a consequence, there is little true economic justification for keeping most data centres close to home. This is truly a global market where cost and reliability matter far more than proximity for the majority of data centres in the UK or elsewhere.

'In fact, one estimate suggests that 50 to 60 per cent of data centres could relocate outside of the UK without any negative business or regulatory impact. Indeed, such data centres moving to the Nordic countries or Iceland, for example, would enjoy both lower costs and more robust electricity infrastructure.'

Mission critical vs. nice to have

Today’s global organisations will recognise that the issues raised above aren’t exclusive to the UK – they are concerns shared worldwide.

The need for a reliable, stable grid is critical to supporting a global, data-driven economy. By moving applications off grids that are at or near capacity – or that are simply unstable – forward-thinking IT decision-makers remove a great deal of risk from their IT resource calculations.

And of course, given the budget constraints that are a reality for IT departments around the world, the question IT leaders need to ask themselves as they look at where to place compute resources in future is, 'Am I paying the price for mission critical when what I really operate is simply nice to have?'

Jorge Balcells, Director of Technical Services at Verne Global