Here comes the Fog: A solution for handling big data

The big data revolution is slowly creeping up on us, leaving us with no choice but to adapt and prepare for the impact it will have on our lives. From wearable exercise bands to driverless vehicles, everything around us is becoming increasingly digitally connected.

These connected devices are causing an explosion of data. According to a report by IDC and EMC, the digital universe is doubling in size every two years and will reach 40,000 exabytes (40 trillion gigabytes) by 2020. Two key factors are cited for this growth. The first is that businesses are continuing to digitally transform their services to gain a competitive advantage; the second is the continued proliferation of Internet of Things (IoT) devices.
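As a quick sanity check of that projection, doubling every two years from the study's 2012 baseline of roughly 2,800 exabytes (a figure assumed here from the same IDC/EMC report) does land in the neighbourhood of 40,000 exabytes:

# Rough arithmetic behind the IDC/EMC doubling claim.
# The 2012 baseline of ~2,800 EB is an assumption taken from the same study.
baseline_eb = 2_800              # digital universe in 2012, in exabytes
doublings = (2020 - 2012) / 2    # one doubling every two years

projected_eb = baseline_eb * 2 ** doublings
print(f"Projected 2020 size: {projected_eb:,.0f} EB")  # ~44,800 EB, i.e. 40,000+ EB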

IoT devices and applications are being deployed at a staggering rate across a myriad of global endpoints. According to an ABI Research study cited by Verizon, the number of connected IoT devices is predicted to grow from 1.2 billion in 2015 to 5.4 billion globally by 2020. This forecast means extremely large amounts of data will need to be transmitted, processed and stored properly to ensure end users and customers reap the benefits of their applications. The term for these large volumes of data is ‘big data.’ Gartner defines big data as ‘high-volume, high-velocity and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.’ So what is the most effective way to manage and process large, complex big data? For Frost & Sullivan analyst Lynda Stadtmueller, fog computing is the solution.

What does fog computing mean?

Fog computing, or edge computing as it’s sometimes known, has gained a lot of traction recently. The term, coined by Cisco in 2013, describes a compute and network framework for IoT applications, although the framework is not exclusive to IoT. Other latency-sensitive, data-intensive applications can also leverage a fog computing architecture. An example would be data triggering a time-sensitive local action, such as heat sensors setting off a building’s sprinkler system.
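As a rough illustration of that kind of local, time-sensitive trigger, consider the Python sketch below. The sensor and sprinkler functions are simulated placeholders rather than real device APIs; the point is that the threshold check and the action both run on the edge node, so no cloud round trip sits in the critical path.

import random
import time

HEAT_THRESHOLD_C = 68.0  # assumed trigger point, typical of fixed-temperature heads

def read_heat_sensor() -> float:
    # Simulated reading; a real edge node would poll local sensor hardware.
    return random.uniform(20.0, 80.0)

def activate_sprinkler() -> None:
    # The action fires locally on the edge node, with no cloud round trip.
    print("Sprinkler activated at the edge")

if __name__ == "__main__":
    while True:
        if read_heat_sensor() >= HEAT_THRESHOLD_C:
            activate_sprinkler()
            break
        time.sleep(0.1)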

Frost & Sullivan’s analyst report, The Fog Rolls In: Network Architectures for IoT and Edge Computing, describes a compute and network framework designed to manage the extremely large amounts of data generated by IoT.

It’s important for enterprises to adopt the most suitable methods for managing big data. An IT infrastructure not suited to the task can experience data congestion, delays, slower service and high costs, all of which directly impact end users.

As IoT devices continue to increase in popularity, data traffic will keep climbing. This traffic travels as data packets over multiple connection ‘hops’ between source and destination points via the public internet. The internet is congested with other data packets, which can cause latency issues that degrade performance. Concentrating data in one location also exposes it to loss, damage and potential cyberattacks.

Clearing the Fog

A fog computing framework can help prevent these infrastructure problems by splitting workloads between local and cloud environments: different ‘things’ (i.e., sensor-equipped, network-connected devices) quickly transmit data to locally deployed ‘fog’ or ‘edge’ nodes, rather than communicating directly with the cloud. From there, a subset of non-time-sensitive data is forwarded from the fog nodes to a centralised cloud or data centre for further analysis and action.

For example, a home smart meter that relays its critical data through fog or edge nodes spares the network unnecessary data traffic on the public internet.
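To make that split concrete, here is a minimal Python sketch of the pattern described above. The smart-meter scenario, the CRITICAL_WATTS threshold and the FogNode class are hypothetical illustrations, not a real fog platform’s API: readings above the threshold are acted on at the edge immediately, while everything else is aggregated locally and only a compact summary is forwarded to the central cloud.

import random
import statistics
from collections import deque

CRITICAL_WATTS = 10_000   # assumed threshold marking a time-sensitive reading
BATCH_SIZE = 60           # assumed batch: one minute of once-per-second readings

class FogNode:
    """Edge node that acts on urgent data locally and batches the rest."""

    def __init__(self) -> None:
        self.buffer: deque[float] = deque()

    def ingest(self, watts: float) -> None:
        if watts >= CRITICAL_WATTS:
            self.act_locally(watts)       # time-sensitive: handle at the edge
        else:
            self.buffer.append(watts)     # non-time-sensitive: hold locally
            if len(self.buffer) >= BATCH_SIZE:
                self.forward_summary()

    def act_locally(self, watts: float) -> None:
        print(f"Edge alert: {watts:.0f} W handled locally, no internet hop")

    def forward_summary(self) -> None:
        # Only a compact summary crosses the public internet to the cloud,
        # instead of every raw reading.
        summary = {
            "count": len(self.buffer),
            "mean_watts": round(statistics.mean(self.buffer), 1),
            "max_watts": max(self.buffer),
        }
        self.buffer.clear()
        print(f"Forwarded to central cloud: {summary}")

if __name__ == "__main__":
    node = FogNode()
    for _ in range(300):  # five simulated minutes of meter readings
        node.ingest(random.uniform(200, 11_000))

The design choice to send only aggregates upstream is what relieves the congestion and latency problems described earlier: the raw stream never leaves the local network unless it is urgent.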

The future of handling big data

What does a successful deployment of this framework look like? As with any enterprise IT strategy, best practice is to tailor it to the needs of the specific organisation. Fog computing is a framework in which network, system and cloud providers work together to create a solution combining edge equipment (i.e., network connectivity, processor capacity, security, and management and analytics platforms) with software (i.e., management, monitoring, security and analytics software), all based on open standards that enable seamless data sharing and processing between edge devices and the cloud.

As the sheer scale of data needed to keep IoT running multiplies, some organisations have attempted to set up fog computing themselves, only to face a far more costly, complex and time-consuming process than they anticipated.

At Equinix, we believe the best deployment of a fog services framework happens through a third-party provider that deploys and manages fog nodes for enterprises. Some of the common enterprise challenges in deploying a fog computing framework are similar to those faced by organisations with a siloed private data infrastructure. That is why we have created a blueprint for building meshed hybrid IT environments in which data produced by people, locations and clouds can be interconnected over high-speed, low-latency connections, securely and cost-effectively. We call this blueprint the Interconnection Oriented Architecture (IOA). It offers all the benefits of fog computing while also being applicable to non-IoT functions that bring data to the edge.

Big data is not going anywhere. Within the next five years, businesses of all sizes will be using some form of data analytics, and the organisations that survive in today’s economy will be infused with digital services. By 2020, at least a third of all data will pass through the cloud. Enterprises therefore need to ensure their IT infrastructure is equipped for the era of big data.

Before joining Equinix in 2014, Klaas worked as a Solution Architect at Tata and NTT Communications, designing global wide area networks. He has more than 16 years of experience in telecommunications roles spanning sales and engineering.