Fog computing: What businesses need to know

You could be forgiven for thinking it’s something to do with improving weather forecasts, but in fact fog computing is a type of decentralised infrastructure. In a fog computing model – sometimes referred to as edge computing – computing resources and applications are placed at the most logically efficient point, whether that’s in the data centre, in the cloud, or somewhere in between.

The idea is to boost efficiency, deliver faster responses and minimise the amount of data that has to be moved around. It is a key part of implementing the Internet of Things, as computing, storage and networking can be placed anywhere along the line between the thing and the cloud. But while the main driver is efficiency, fog computing is sometimes also adopted to meet compliance and security requirements.

The use of the term ‘fog’ is meant to convey the idea of cloud computing closer to the ground. You can think of it as a layer sitting between devices and a cloud or conventional data centre. Although you may not have heard of it, fog computing is already widely used: research company IDC estimated in 2015 that close to 40 percent of IoT data was being analysed on devices close to the things themselves.

How it works

A fog computing network has two planes: a control plane and a data plane. The control plane provides an overview of the network structure. The data plane – sometimes referred to as a forwarding plane – determines what happens to data packets as they arrive.

The data plane allows computing resources to be placed anywhere on the network. They don’t have to sit on a server at the centre; they can be distributed to desktop or mobile devices at the edge of the network.

For this to work, the network itself will have a number of ‘fog nodes’. These receive data from devices in real time and will either process it directly or store it temporarily before passing it elsewhere.
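The behaviour of such a node can be sketched in a few lines. This is a minimal illustration, not based on any specific fog product: the class name, the tuple format for readings and the batch size are all assumptions made for the example.

```python
# Hypothetical sketch of a fog node: readings arrive as (device_id, value)
# tuples; the node stores them temporarily and forwards them in batches.

class FogNode:
    """Receives device data in real time, stores it briefly,
    and passes it elsewhere once a batch has accumulated."""

    def __init__(self, forward, batch_size=10):
        self.forward = forward        # callable that sends data upstream
        self.buffer = []              # temporary local store
        self.batch_size = batch_size

    def receive(self, device_id, value):
        self.buffer.append((device_id, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.forward(self.buffer)  # pass the stored data elsewhere
            self.buffer = []

# Example: collect forwarded batches in a list instead of a real uplink
sent = []
node = FogNode(forward=sent.append, batch_size=3)
for i in range(7):
    node.receive("sensor-1", i)
# Two full batches have been forwarded; one reading remains buffered.
```

A production node would of course also handle timeouts, retries and local processing, but the store-and-forward core is the same.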

This distributed approach to handling information is becoming more widespread as the number of Internet of Things devices increases. The reason for this is that IoT devices generate large amounts of data, and transmitting all of it directly to a cloud service can consume large amounts of bandwidth and create problems with latency. If the data is part of a control system – for machinery, for example – this can lead to performance problems and a lack of responsiveness. Fog computing allows IoT data to be processed in a data hub or smart device closer to the sensor that’s generating it.
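One common way a nearby hub cuts bandwidth is by summarising a window of raw readings before anything is sent to the cloud. The function below is an illustrative sketch – the field names and the choice of statistics are assumptions, not part of any real fog API:

```python
# Sketch: reduce a window of raw sensor samples to a compact summary
# on a local hub, instead of transmitting every individual sample.

def summarise(samples):
    """Collapse raw readings into a few summary statistics."""
    return {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
    }

raw = [20.1, 20.3, 19.9, 20.0, 21.2]  # e.g. temperature readings
payload = summarise(raw)              # four numbers sent instead of the raw stream
```

Over thousands of sensors sampling many times a second, sending summaries rather than raw streams is where the bandwidth saving comes from.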

The key to using this effectively lies in prioritising the data packets and routing them accordingly. The most critical data is analysed on the fog node closest to the device generating it; the node itself can initiate actions that need to be carried out quickly, such as opening a valve or tripping a switch. Data that can wait a little longer is passed further up the line, as resources and bandwidth permit, to an aggregation node – in smart metering, for example, data from individual meters would be passed to a local sub-station. Data that’s simply for historical analysis, big data analytics or archiving is passed directly to the cloud or a data centre as resources allow.
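The tiered routing described above can be expressed as a simple policy. The priority labels and destination names here are made up for illustration; real systems would derive priority from packet contents or network policy:

```python
# Sketch of priority-based routing: critical data stays on the nearest
# fog node, less urgent data goes to an aggregation node, and the rest
# travels on to the cloud. Labels and destinations are hypothetical.

def route(packet):
    priority = packet.get("priority", "low")
    if priority == "critical":
        return "fog-node"          # act immediately, e.g. trip a switch
    if priority == "normal":
        return "aggregation-node"  # e.g. local sub-station for smart meters
    return "cloud"                 # historical / big-data analytics, archiving

destination = route({"priority": "critical"})  # "fog-node"
```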

Development

A group consisting of ARM Holdings, Cisco Systems, Dell, Intel, Microsoft and Princeton University got together in 2015 to form the OpenFog Consortium. The consortium now has over 40 members and works to promote standards and the use of fog architecture.

In 2016 the consortium published a white paper setting out what it calls the ‘eight pillars’ of fog computing. These it defines as: 

* Security
* Scalability
* Openness
* Autonomy
* Programmability
* RAS (Reliability, Availability, and Serviceability)
* Agility
* Hierarchy

Networking company Cisco is one of the leaders in developing fog computing with its IOx system, which allows applications to be executed in the fog and offers secure links via the company’s networking systems. It also allows developers to work on fog systems using familiar open source tools.

Business benefits

The use of fog computing has the potential to impact a number of areas of business. With the rise of the Industrial Internet of Things, fog applications can be used to monitor and analyse data from sensors and, if necessary, trigger an action or an alert. This could be applied in many areas: changing equipment states – such as opening or closing a valve in response to flow or pressure readings – or security tasks like triggering automated locks or zooming a surveillance camera. It can also be used to predict failures, so that a technician can be alerted to carry out a repair before a problem becomes critical.

So far fog computing has been used mainly in industries like manufacturing, oil and gas production, and utilities. It also has potential in transportation systems, such as automatic train control or road traffic management, and in delivering public utilities like electricity, gas and water.

All of this can cut expense by reducing bandwidth usage and offer improved agility via faster response to sensor data. It can improve security too by reducing the amount of sensitive data that has to be sent over the network.

Drawbacks

There are some concerns surrounding the use of a fog model. Many of these concern security: it’s easier to secure data when it’s kept in one place, so if sensitive information is distributed among many devices there are more points where it’s vulnerable. It’s therefore important that fog nodes are subject to appropriate security controls.

Another issue is that the fog adds a further layer of complexity which may make some enterprises wary of adopting it in the short term.

Where next?

As the number of IoT devices continues to increase so does the amount of data they generate. Gartner estimates there will be 26 billion IoT devices by 2020. This could lead to a need for big increases in network and data centre capacity.

Fog computing offers a way of distributing that workload in order to reduce the strain on infrastructure whilst delivering results more quickly. It also allows for easier and more cost effective scaling. For applications that require a rapid response time or for remote locations where fast networking isn’t available, fog computing offers a useful alternative to both the cloud and more traditional models.