When big data is a big headache: addressing the challenges to reap rewards


It’s perhaps no surprise that the collection and intelligent use of data have reached the top of the agenda for many organisations - and data intelligence has firmly escaped the shackles of being just “an IT issue”. A growing majority of businesses are now seeking to leverage data as a critical strategic asset, helping to uncover new sources of business value – and “big data” sits firmly at the boardroom table.

Whilst some argue that the term “big data” is overused – or misused to mean the kind of data analysis that could be done on a single desktop computer, when it should really be reserved for applications requiring serious processing power – an argument over terms won’t change its lasting effect on business culture.

It is widely agreed that the potential benefits of data analysis and application are extensive, but the fundamental changes to business processes they necessitate haven’t come without headaches. As executive awareness of the potential power of data started to take hold, businesses struggled with how best to organise around data - as an activity, a business function and a capability. For some companies this led to a new role in their organisation - the equivalent of a “data czar” – which has come to be known as the Chief Data Officer (CDO).

But for us, the real issue is more fundamental than “who owns the data” – and it still sits with the IT-savvy within an organisation. Companies need to know how to extract the most intelligence from their data and how to make the process as smooth as possible.

Storage, processing and delivery

Fundamentally, data needs to be stored, processed and delivered in a meaningful way so that it can be used effectively.

One of the key characteristics of big data applications is that they demand real-time or near real-time responses. Financial applications need to give traders information on commodities quickly, so they can make buy or sell decisions and succeed in a high-velocity, competitive industry. Police departments are increasingly accessing data to gain real-time intelligence on suspects, or to make an immediate impact on an investigation or at a crime scene in a way that wouldn’t previously have been possible.

Data volumes are growing very quickly - especially unstructured data – typically at a rate of around 50 per cent annually. Domo, an American software company specialising in business intelligence tools and data visualisation, recently released its fourth annual “Data Never Sleeps” report, which highlights the fact that data is now more ubiquitous than ever – and revealed that more than 1 million megabytes of data are generated by mobile connections every single minute of every day.
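To put that figure in perspective, the short Python sketch below is a rough back-of-the-envelope conversion - an illustration only, assuming decimal units and that the per-minute rate cited in the report holds around the clock:

# Back-of-the-envelope conversion of the figure cited above:
# 1 million megabytes of mobile data generated every minute.
# Decimal units are assumed (1 TB = 1,000,000 MB; 1 PB = 1,000 TB).

MB_PER_MINUTE = 1_000_000        # figure from the "Data Never Sleeps" report
MINUTES_PER_DAY = 60 * 24

tb_per_minute = MB_PER_MINUTE / 1_000_000              # ~1 TB every minute
pb_per_day = tb_per_minute * MINUTES_PER_DAY / 1_000   # ~1.44 PB per day
pb_per_year = pb_per_day * 365                         # ~526 PB per year

print(f"~{tb_per_minute:.0f} TB/minute, ~{pb_per_day:.2f} PB/day, ~{pb_per_year:.0f} PB/year")

On those assumptions, mobile connections alone account for roughly a terabyte every minute - around one and a half petabytes every day.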

All this means intense pressure on the security, servers, storage and network of any organisation - and the impact of these demands is being felt across the entire technological supply chain. IT departments need to deploy more forward-looking capacity management to be able to proactively meet the demands that come with processing, storing and analysing machine-generated data.
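As a simple illustration of what forward-looking capacity management means in practice, the sketch below compounds a purely hypothetical 500 TB storage estate forward at the roughly 50 per cent annual growth rate mentioned above - both the starting figure and the five-year horizon are assumptions chosen for illustration:

# Minimal capacity-projection sketch: compound a hypothetical storage
# estate forward at the ~50% annual growth rate cited for unstructured data.
# The 500 TB starting point and five-year horizon are illustrative only.

CURRENT_CAPACITY_TB = 500      # hypothetical current estate
ANNUAL_GROWTH = 0.50           # ~50 per cent per year
YEARS = 5

capacity = CURRENT_CAPACITY_TB
for year in range(1, YEARS + 1):
    capacity *= 1 + ANNUAL_GROWTH
    print(f"Year {year}: ~{capacity:,.0f} TB required")

At that rate an estate more than doubles every two years and grows more than seven-fold over five, which is the kind of headroom a capacity plan has to anticipate well ahead of the next procurement cycle.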

Historically, for a data centre to ‘meet new needs’, it would simply add floor space to accommodate more racks and servers. However, the demand for increased IT resources and productivity has come hand in hand with an increased need for higher efficiency, better cost savings and lower environmental impact. High Performance Computing (HPC), once seen as the preserve of large corporations, is now being looked at as a way to meet the challenge, and is requiring data centres to adopt high-density innovation strategies in order to maximise productivity and efficiency, increase available power density and boost the ‘per foot’ computing power of the data centre.

A flexible approach 

Many companies have turned to outside organisations to help them meet these new challenges. Of course, the IT industry is devoted to designing innovative tools and techniques to keep up with the rapid evolution of tech trends like big data - and tech vendors already offer a multitude of solutions to the capacity and complexity problems.

Third-party colocation data centres are increasingly being seen as the way to support this growth and innovation, rather than CIOs expending capital to build and run their own on-premise capability.

For many, cloud computing is an HPC user’s dream, offering almost unlimited storage and instantly available, scalable computing resources. For us, the cloud is compelling, offering enterprise users the very real opportunity of renting infrastructure they could not otherwise afford to purchase – and enabling them to run big data queries that could have a massive, positive impact on their organisations’ day-to-day strategy and profitability.

However, one size doesn’t fit all. Organisations need to take a flexible approach to storage and processing. Companies must choose the most appropriate partner to meet their pricing and performance needs – whether on-premise, in the cloud or both - and have the flexibility to scale their storage and processing capabilities as required. They must also make sure they aren’t paying for more than they need, and look for a disruptive commercial model that gives absolute flexibility - from a rack to a suite, for a day to a decade.

The big security challenge

It is perhaps obvious that the more data is stored, the more vital it becomes to ensure its security.

Malicious attacks on IT systems are becoming more complex and new malware is constantly being developed – and, unfortunately, companies that work with big data face these issues on a daily basis. A lack of data security can lead to significant financial losses and reputational damage for a company and, as far as big data is concerned, losses due to poor IT security can exceed even the worst expectations.

When it comes to security, businesses will generally not slow the pace and progress of the business simply to achieve an ideal security posture. Whilst security practice catches up with the speed of the big data revolution, organisations are potentially more vulnerable.

Many turn to colocation to complement a cloud solution, developing a hybrid approach to meeting storage needs.  These organisations recognise that moving into a shared environment means that IT can more easily expand and grow, without compromising security or performance.

By choosing colocation, companies are effectively renting a small slice of the best infrastructure money can buy - uninterruptible power and grid supply, back-up generators, super-efficient cooling, 24/7 security and dual-path multi-fibre connectivity - all for a fraction of the cost of buying and implementing it themselves.

Big data, big centre?

So, the demands that come with big data mean that, ultimately, the data centre now sits firmly at the heart of the business. Beyond simply storing machine-generated data, the ability to access and interpret it, very quickly, as meaningful, actionable information is vitally important and will give a huge competitive advantage to those organisations that do it well.

Getting the data centre strategy right means that a company has an intelligent and scalable asset that enables choice and growth. However, get it wrong and it becomes a fundamental constraint on innovation. Organisations must ensure their data centre strategy is ready and able to deal with the next generation of computing and performance needs - to remain not only competitive and cost efficient, but also ready for growth.

So when it comes to big data, a new approach is needed. Many organisations are already collating and storing large sets of data, but intelligence is only power if it’s used. The IT industry has a vital role to play in helping organisations to realise these ambitions.

Darren Watkins, managing director for VIRTUS Data Centres