
What would it be worth to your business to get your data 10 times as fast?

According to research by IBM, 40 zettabytes of big data will be created by 2020 – at least 300 times more data than was created in 2005. This article examines how organisations can make big data work faster for their business than their competitors can.

The speed at which data can be transferred and analysed can often give an organisation a competitive advantage. Data acceleration is also crucial in the face of an unforeseen human-made or natural disaster. The ability to rapidly restore backed-up, and often encrypted, data is vital because it enables product and service delivery to continue unabated. On the other side of the coin, organisations need to be able to back up data quickly and in real time – in some cases as a matter of life or death.

At present, with Hadoop and Apache Spark, organisations can analyse vast amounts of data. “With the ability to do this we can bring in data from IoT and the cloud to provide a greater in-depth analysis of the information”, says David Trossell – CEO and CTO of Bridgeworks, one of Gartner’s Cool Vendors for 2016. Think of data like this, he adds: “Data is the food that feeds big data, and the more work you do, the more food you require.” Having the ability to analyse data is worthless unless you can get it – ‘the food’ – to and from the place where it is stored or cooked.

This problem means that a lot of food is still waiting to be harvested, but it can’t be left standing. It has to be fresh to be at its best, and this is particularly true whenever data analysis is applied to business intelligence, the prediction of natural disasters, weather pattern forecasting or any other kind of analysis. This means the big data harvest and preparation have to be fast and efficient to get the data to the plate rapidly.

Fresh data, fresh analysis

“For that data to be relevant in this fast-changing world it has to be fresh, and the more of it you have, the better”, he explains. Data variety is also a prerequisite of a good diet. For analysis purposes, big data needs to come from a wide range of sources, including social media, market pricing, and both your organisation’s and other firms’ sales figures.

Data variety enables organisations to gain a balanced and in-depth view, which is essential, but it also brings its own problems to the table. “That data may not be local but from the other side of the world in the cloud, or accumulated from IoT sensors, and moving that data over that distance will slow down its delivery due to the effects of latency on the network”, Trossell explains. As data grows ever bigger, ingesting it could become much slower – so slow that it becomes impossible to use. This scenario is the exact opposite of what everyone wants to achieve, as it can lead to outdated data being analysed.

Trossell therefore warns us: “Trying to send all this data down slow links is going to put you behind your competitors that have links 10x or 100x faster than yours. However, just adding larger links isn’t always going to solve the problem. It’s like having large trucks but narrow roads. Wide Area Networks (WANs) and the cloud are full of these narrow roads killing off your network performance.” To return briefly to his food analogy, the impact of network latency can leave data half-baked and useless.

Security considerations

A survey of 1,000 IT professionals across six continents by Neustar also highlights the need to back up data and to store it in more than one location: "The research results show that although revenue loss caused by a Distributed Denial of Service (DDoS) related outage is usually the main concern, 57 per cent of all breaches involved some sort of theft, including intellectual property and customer data as well as financial data.

"More troubling, following the initial breach, 45 per cent of organisations reported the installation of a virus or malware - a sign that attackers are interested in causing ongoing harm.” Such attacks can lead to lost data, and a lack of service continuity. Business intelligence and other systems therefore need to be securely protected to prevent downtime, lost opportunities to analyse big data, lost customers and reputational damage.

The headline findings from the study include:

  • 73 per cent of global brands and organisations have been attacked, which should put virtually every organisation with a digital presence on notice.
  • 82 per cent of organisations experiencing a DDoS attack were then attacked repeatedly. In EMEA, 47 per cent of organisations have been struck more than five times.
  • More than half (57 per cent) of organisations reported theft after attack, including loss of customer data, finances or intellectual property. 50 per cent lost over $100,000 per hour and 42 per cent needed at least three hours to detect that they were under DDoS attack.
  • 76 per cent of organisations are investing more than last year in response to the DDoS threat.
  • 71 per cent of financial services firms attacked experienced some form of theft and 38 per cent found viruses or malware activation after an attack. With big money, customer trust and regulatory implications on the line, 79 per cent of financial services organisations are investing more this year than last.

Trossell warns: “You just never know how the next disaster will strike or how it will affect you, no matter how you plan and assess your risks – something will always come out of left field. Take the disaster of Marco Marsala as an example. In one mistaken moment he completely deleted his company. How? Let him explain in his own words: ‘I run a small hosting provider with less than 1,535 customers ….. Last night I ran a small Bash script with rm -rf {foo}/{bar}.’ This deleted all his servers, and because his backup servers were still mounted they got deleted as well!”
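Marsala’s mistake is easy to reproduce: a recursive delete does not distinguish live data from a backup volume that happens to be mounted. Below is a minimal Python sketch of the kind of guard that keeps backup mount points out of a destructive sweep – the paths and function name are hypothetical, not from the article:

```python
def safe_delete_targets(paths, protected_mounts):
    """Return only the paths that do not touch a protected backup mount.

    A recursive delete (like rm -rf) run over this filtered list cannot
    descend into a still-mounted backup volume.
    """
    # Normalise to trailing-slash form so "/mnt/backup" matches
    # "/mnt/backup/daily" but not "/mnt/backup2".
    protected = tuple(m.rstrip("/") + "/" for m in protected_mounts)
    return [
        p for p in paths
        if not (p.rstrip("/") + "/").startswith(protected)
    ]

# Example: the live web root survives; anything on the backup mount is skipped.
targets = safe_delete_targets(
    ["/srv/www", "/mnt/backup/daily", "/mnt/backup"],
    protected_mounts=["/mnt/backup"],
)
# targets == ["/srv/www"]
```

Better still, as Trossell’s first tip below suggests, keep the backup unmounted – air-gapped – except during the backup window itself, so there is nothing for a stray script to find.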

Best practice tips

Trossell therefore offers his top five best practice back-up and restore tips:

  1. Ensure that there is an air gap between live and backed-up data to allow for data retrieval in the event of ransomware attacks and natural disasters.
  2. Identify threats and risks: people are finding new ways to threaten your data.
  3. Think data first: define the data you need to get back, as this will define your back-up and restore policies as well as your infrastructure.
  4. Disaster recovery is everybody’s responsibility, not just IT’s. All senior executives should be involved, including CMOs. Don’t forget the buck stops with the CEO – get them involved as well.
  5. Practice, practice and practice.
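Tip 3 – think data first – can be made concrete with simple arithmetic: the volume of data you must get back, divided by the throughput your link actually sustains, sets your minimum restore window, and that window in turn dictates the infrastructure. A Python sketch with illustrative figures (the numbers are assumptions, not from the article):

```python
def restore_hours(data_tb: float, throughput_mbps: float) -> float:
    """Minimum hours to restore data_tb terabytes at a sustained
    throughput in megabits per second (1 TB = 10**12 bytes)."""
    bits = data_tb * 1e12 * 8
    seconds = bits / (throughput_mbps * 1e6)
    return seconds / 3600

# Restoring 10 TB over a link sustaining 1 Gbit/s takes over 22 hours.
fast = restore_hours(10, 1000)   # ≈ 22.2 hours
# If latency cuts the effective rate to 100 Mbit/s, it takes over nine days.
slow = restore_hours(10, 100)    # ≈ 222 hours
```

If the business can only tolerate hours of downtime, arithmetic like this exposes early whether the WAN – not the backup software – is the real constraint.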

Latency and packet loss

So data and operational security is as important an issue as being able to move data at speed across a Wide Area Network. But what is preventing data from being sent 10 times as fast – and as fast as what? “The biggest issue about moving data from external sources across WANs is the way networks work against you in terms of latency and packet loss”, says Trossell. Fast networks can be brought to their knees when the ‘road’ is too narrow, stopping your combine harvester from getting your product in from the fields to the factories for processing. Latency is this narrow road, and as Trossell says it’s a hidden performance killer, but it can be mitigated with solutions such as PORTrockIT.
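There is simple arithmetic behind the ‘narrow road’: a single TCP stream can never move more than one receive window per round trip, however fat the pipe. A Python sketch of that ceiling, with illustrative window and round-trip figures (assumptions, not from the article):

```python
def max_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on single-stream TCP throughput: one window of data
    per round trip, expressed in megabits per second."""
    rtt_s = rtt_ms / 1000.0
    return (window_bytes * 8) / rtt_s / 1e6

# A classic 64 KiB window over a 1 ms LAN round trip:
lan = max_throughput_mbps(64 * 1024, 1.0)    # ≈ 524 Mbit/s
# The same window over an 80 ms transatlantic round trip:
wan = max_throughput_mbps(64 * 1024, 80.0)   # ≈ 6.6 Mbit/s
```

A 10 Gbit/s link changes nothing in the second case: the window-per-round-trip limit, not the bandwidth, is the binding constraint – which is why latency mitigation, rather than a bigger pipe, is what actually widens the road.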

“When the CMO complains of poor performance getting data on board, the network administrator looks at the network and sees plenty of spare capacity – so there’s no problem from their perspective. This leads to a stalemate in which the new high-speed WAN connection goes to waste”, warns Trossell. He says he sees this issue occurring often, and so advises marketers, data officers and network managers to work together as a team. At the end of the day this approach is about the way organisations and departments look at their resources, their data infrastructure and the effect of latency across a WAN.

Be resourceful

“It’s about making the most of your investment in the high-speed WAN by mitigating latency and packet loss – sweating your resources to maximise your competitive position”, he suggests, comparing it to the position of a General on the battlefield. “One factor that always affects their decision is the speed and the depth of the information on which to base their next move to gain the advantage and win the battle – such is the importance they give to communication channels”, he explains.

Like Generals, Chief Marketing Officers (CMOs) need to use communication to understand what their competitors are doing and, more importantly, what they are not doing. Similarly, without communication supported by technology, Generals can’t make the right decisions.

Trossell concludes: “First you must understand the corporation’s recovery requirements; these will define your restore data requirements and, in turn, the infrastructure. However, whilst that air gap is important for data resilience, it has a dramatic effect on performance when restoring data over the WAN – it’s our old friends latency and packet loss again.” So what is getting your business data 10 times as fast worth? It’s invaluable for gaining a competitive advantage, maintaining service and business continuity, and protecting your organisation’s reputation and revenue streams.

The Neustar report also demonstrates why data security has to be part of this equation, because without security customers can be lost, financial losses can be incurred and reputations can be damaged. It’s therefore wise to invest in solutions that permit fast and secure data transfers.

Photo credit: Sashkin / Shutterstock

Graham Jarvis
Graham is an experienced editor and journalist. He is the founder of Media-Insert Communications; the former editor of The Marketing Leaders, the Chartered Institute of Marketing’s Technology group’s e-magazine; and a former guest editor of BT (which is now known as and owned by SIFT Media).