Businesses face analytic challenges as data storage grows to 1.8 trillion GB

There's a common bond among all technology users: the accumulation of data. Recent data from analytics firm Infobright provides some astonishing numbers about the amount of data users collectively store and details common issues that businesses, even the SMB, must tackle when data starts to snowball.

For example, did you know that if digital data were broken into bits of info, there would be more data particles than stars in the physical universe? Or that we are currently storing 1.8 trillion GB of data across 500 quadrillion files?

There's no stopping this data agglomeration, and many businesses and IT departments are faced with the daunting task of analysing this information. The analysis of business data is crucial for running a successful business, and analytics can range from simply being able to accurately calculate profit and loss to having a good grasp on the demographic information of your most loyal customer base.

The challenge many are facing is how to tame and analyse that data as it grows at an astounding rate. It can become complicated to manage and cause performance issues on a network.

According to Infobright, when it comes to dealing with the complexities and performance issues of analysing big data, most IT managers tend to:

  • Tune and upgrade databases, which can help but can also increase administration costs and licensing fees.
  • Upgrade hardware processing capabilities, which often increases overall Total Cost of Ownership.
  • Expand storage systems; again, another rise in cost.
  • Archive old data, which is particularly problematic: it reduces the amount of data available to analyse at any one time, which can mean less accurate analytical reports.
  • Upgrade network infrastructure, increasing costs and network complexity.

Managing growing data in a business environment while keeping cost and complexity low is what many businesses are trying to tackle. Infobright and many other data analysis vendors suggest the answer is not so much in adding hardware but in approaching the storage of data in different database formats - such as storing data in columnar databases rather than traditional row-based databases. Because a query against a columnar database reads only the columns it needs rather than entire rows, analytics and searches can be performed much faster than with row databases.
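To make the row-versus-column distinction concrete, here is a minimal sketch in Python. It is purely illustrative (it does not reflect Infobright's actual implementation, and the table and field names are hypothetical): the same small table is stored once as a list of rows and once as a set of per-column lists, and a single-column aggregate is run against each layout.

```python
# Hypothetical orders table, stored row-wise: each record keeps all fields together.
rows = [
    {"order_id": 1, "customer": "A", "amount": 120.0},
    {"order_id": 2, "customer": "B", "amount": 75.5},
    {"order_id": 3, "customer": "A", "amount": 33.0},
]

# The same table stored column-wise: one contiguous list per column.
columns = {
    "order_id": [1, 2, 3],
    "customer": ["A", "B", "A"],
    "amount": [120.0, 75.5, 33.0],
}

# Row store: summing one column still touches every field of every row.
row_total = sum(r["amount"] for r in rows)

# Column store: the same aggregate scans only the "amount" list,
# skipping the other columns' data entirely - this is why analytic
# queries over a few columns of a wide table favour columnar storage.
col_total = sum(columns["amount"])

assert row_total == col_total == 228.5
```

The results are identical; the difference is how much data each layout must read to produce them, which is where the claimed performance gain for analytics comes from.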

Large data sets are posing new data management issues, even in small businesses. IT and data owners must find creative and cost-effective ways for their companies to work with large amounts of data.