Big Data: Time for a new approach to analysis

The Big Data problem is accelerating as companies get better at collecting and storing information that might yield business value through insight or improved customer experiences. Extracting that insight used to be the responsibility of a small, specialist group of analysts, but this is no longer the case. We are standing at a nexus between Big Data and the demands of thousands of users, something we call "global scale analytics" at MicroStrategy. The old architectural approaches are no longer up to the task, and this new problem needs radically new technology. If companies continue with the old approach, Big Data will fail to reach its true potential and will simply become a big problem.

Analytics applications now regularly serve the needs of thousands of employees; a single employee may need access to hundreds of visualisations, reports and dashboards. The application must be ready for a query at any time, from any location, and results must be served with 'Google-like' response times; users' experience of the web is the benchmark by which they judge application responsiveness in the work environment.

With this huge rise in data and user demands, the traditional technology stack simply cannot cope: it has become too slow and expensive to build and maintain an analytics application environment. There are some excellent point solutions, but the problem lies in the integration between every part of the stack, and the stack only performs as well as its weakest link.

The industry has been working to solve only half the problem, data collection and storage, rather than looking at the full picture, which also includes analytics and visualisation. Loosely coupled stacks scale poorly and impose a huge management and resource overhead on IT departments, making them uneconomical and inflexible.

Solving the end-to-end Big Data analytics problem requires an architecture that tightly integrates each level of the analytics stack and takes advantage of the commoditisation of computing hardware. Done well, this delivers analytics that scale with near-perfect linearity and economies of scale, producing sub-second response times on multi-terabyte datasets.
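To make the scale-out idea concrete, here is a minimal sketch of partitioned, in-memory parallel aggregation, the basic pattern behind near-linear scaling: data is split across workers, each worker scans its RAM-resident partition independently, and the small partial results are combined at the end. All names here (the `revenue` column, the partition layout) are hypothetical illustrations, not MicroStrategy's implementation.

```python
# Hypothetical sketch of partitioned, in-memory parallel aggregation.
# Illustrates the scale-out principle only; not any vendor's actual engine.
from multiprocessing import Pool

NUM_PARTITIONS = 8  # in a real deployment, one partition per node or core


def scan_partition(partition):
    """Scan one RAM-resident partition and return a partial aggregate."""
    total, count = 0.0, 0
    for row in partition:  # sequential scan of in-memory rows
        total += row["revenue"]
        count += 1
    return total, count


def parallel_average(partitions):
    """Fan the scan out across workers, then combine partial results."""
    with Pool(NUM_PARTITIONS) as pool:
        partials = pool.map(scan_partition, partitions)
    total = sum(t for t, _ in partials)
    count = sum(c for _, c in partials)
    return total / count if count else 0.0


if __name__ == "__main__":
    # Toy data: each partition is a list of rows held entirely in memory.
    partitions = [
        [{"revenue": float(i)} for i in range(p, 1_000_000, NUM_PARTITIONS)]
        for p in range(NUM_PARTITIONS)
    ]
    print(parallel_average(partitions))
```

Because each partition is scanned independently and only tiny partial aggregates cross the network, doubling the number of workers roughly halves the scan time, which is where the near-linear scaling in the paragraph above comes from.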

MicroStrategy has become the first, and currently the only, company to make this approach a commercial reality, tightly integrating data, analytics and visualisation. PRIME is a massively parallel, distributed, in-memory architecture with a tightly integrated dashboard engine. Companies such as Facebook are using PRIME to analyse billions of rows in real time, proof that this new approach is pushing the point solutions of the past out of the spotlight.

Regardless of your application, if you have thousands of users, exploding data collection, highly dimensional data, complex visualisation or a globally distributed user base, the Big Data problem will keep getting bigger, and with every day it grows you face diminishing returns. Businesses need to make their analysis as efficient as their data gathering. We are in a new era of data exploration that demands a jump in the scale and performance of analytics applications to achieve global scale analytics. What is the point in collecting all that data if you can't use it?

Kevin Spurway is Senior Vice President of Marketing at MicroStrategy
