Global analyst firms IDC and Gartner have widely publicised predictions that the IoT space will explode in the years up to 2020 and beyond. While estimates of the number of smart, connected devices vary, everyone agrees that tens of billions of new connected devices will be accessing the Internet by 2020. Naturally, with billions of new connected devices accessing networks and the Internet, questions around data storage and security are front of mind.
Given the disparate nature of the IoT and the variety of sectors it serves, the devices themselves can be located (literally) anywhere around the globe: a smart meter in a home, a sensor on a remote weather station, or devices in the stomachs of cows alerting farmers when to milk the herd. Connected devices know no bounds in terms of location or application.
The variety of these locations, however, presents a serious question for IT teams and businesses regarding the processing, management and security of the data created by these devices. Traditional IT approaches are not applicable to the IoT in most cases, which calls for a rethink of traditional, centralised IT networks and controls. Increasingly, edge computing will come to the fore when we talk about the IoT, enabling large quantities of data to be managed beyond the confines of the corporate IT boundaries.
However, security will remain a significant challenge. While cloud can provide the flexibility and compute power to enable software patch updates on devices, the sheer number of connected devices in play means that compromised hardware will remain an issue. Simply put, the bigger your network, the more possible entry points there are for hackers looking to exploit it.
How to deal with an unpredictable IoT data stream and avoid exploitation
IoT data is mostly unstructured, and therefore can easily be stored in public cloud infrastructure. All the major cloud providers offer low-cost scalable storage systems based on object storage technology. With high-speed networks and no charge for data input, public cloud is a great location to store the volumes of IoT data being generated by businesses.
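As a rough sketch of how such unstructured readings might be packaged for an object store, the snippet below builds a key/payload pair per reading. The key scheme (device/date/timestamp) and the helper name are illustrative assumptions, not any provider's requirement; the actual upload call would come from the cloud provider's SDK.

```python
import json
import time

def make_object(device_id, payload, ts=None):
    """Package one unstructured IoT reading as an object-storage
    key/value pair. The key layout (device/date/timestamp) is an
    illustrative assumption, not a provider requirement."""
    ts = ts if ts is not None else time.time()
    day = time.strftime("%Y/%m/%d", time.gmtime(ts))
    key = f"iot/{device_id}/{day}/{int(ts * 1000)}.json"
    body = json.dumps({"device": device_id, "ts": ts, "data": payload})
    return key, body

# A provider SDK call such as put_object(bucket, key, body) would then
# upload each pair; as noted above, data ingress is typically free.
key, body = make_object("meter-42", {"kwh": 3.7}, ts=1700000000)
```

Date-partitioned keys like this are a common convention because downstream big-data tools can prune by prefix when scanning the bucket.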
But the public cloud has more to give. Cloud service providers (CSPs) have extended their product offerings to include big data analysis tools that ingest and process large volumes of unstructured content. This allows businesses to create highly scalable ML/AI applications that process data more efficiently than in a private data centre.
Sensor data from Industry 4.0: understanding the how and where
Typically, IoT devices are seen as individual, remotely managed and embedded appliances such as cameras, but this isn’t always the case. Many businesses have distributed environments that run one or more servers at branch locations to monitor building access, environmental controls or other tasks that relate directly to the business itself. As a result, IoT is a mesh of devices that could create, store and process content across numerous physical locations.
Distributed data is the information created outside the corporate data centre or network. We are increasingly seeing the term ‘edge’ used to describe computing and data management tasks performed outside core data centres. Although edge computing has existed for many years, the current evolution in IoT and edge computing is notable for the sheer volume of data created in non-core data centre locations.
This brings unique challenges to IT departments, which must ensure this data is adequately secured, collated and processed. Most IT organisations are used to knowing exactly where their data resides. With IoT, the challenge of getting their arms around all of the content a business owns is much greater, with obvious implications for user privacy and for regulations such as GDPR.
With so much information created at the edge, moving all of the data into the data centre for timely processing is often impractical. First, with a wide variety of devices deployed, a business may simply be unable to move the data back without investing heavily in external networking.
Second, in many instances the value of the data may not be best served by storing everything. For example, a camera that counts cars passing a traffic intersection doesn't need to store the entire video; it only needs to report back the number of cars counted over specific time periods. The video data could be moved back at some point in the future or simply discarded.
A third point to consider is the timely processing of data. IoT devices may need to make local processing decisions quickly and not tolerate the latency of reading and writing the data into a core data centre for processing to occur. This distributed data and processing requirement means that businesses need to add the capability to push compute and applications to the edge and, in many cases, pre-process data before it is uploaded to the core data centre for long-term processing.
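The traffic-camera example above can be sketched as an edge pre-aggregation step: only windowed counts leave the device, while the raw frames stay local or are discarded. The function and parameter names here are illustrative, not part of any specific edge platform.

```python
from collections import Counter

def aggregate_detections(timestamps, window_s=60):
    """Reduce a stream of per-car detection timestamps (seconds) to
    counts per fixed time window, so only this small summary is
    uploaded to the core data centre and the raw video can be
    discarded at the edge."""
    counts = Counter(int(ts // window_s) * window_s for ts in timestamps)
    # Report {window_start_seconds: cars_counted} in time order; this
    # is the compact payload that travels over the constrained link.
    return dict(sorted(counts.items()))

# Detections at 5s, 30s, 61s, 62s and 130s: two cars in the first
# minute, two in the second, one in the third.
summary = aggregate_detections([5, 30, 61, 62, 130])
```

The same pattern generalises to any edge sensor: aggregate locally at low latency, then ship summaries upstream on whatever schedule the network allows.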
IoT in a nutshell: why is it critical and what makes it difficult?
The data that connected devices produce is the key to the value they can deliver. In the truest sense of the Internet of Things, data streams can and should be accessible in real time, providing companies with the latest information from which they can pivot and adapt. Consider this in an Industrial IoT (IIoT) scenario: real-time data could influence the production cycles of heavy industry, minimising downtime in a blast furnace, for example, and in the process saving material costs and maximising cost efficiencies.
If you consider the potential importance of data created by IoT devices, then you begin to understand the central importance of being able to access it and, even more so, the need for it to be 100 per cent reliable and secure from external 'tampering'.
Predicting business requirements is key to any company’s forecasting plans and, ultimately, bottom-line ROI, whether it’s a publicly-listed, established multi-national, or a fast-paced, edgy, new disruptor. With the digitalisation of markets, an increasingly crowded competitive space, and a seemingly endless stream of new data sets and tech hitting the decks, it can often feel for businesses (big and small) that they are swimming against an unrelenting tide of technology.
However, we should not view these challenges as insurmountable. Technology will inevitably continue to advance, but those who meet the challenge to stay relevant and 'embrace the chaos' will find that out of the chaos comes clarity, and out of clarity of mission comes a competitive advantage.
IoT represents a sea-change in the way companies think about IT and business approaches, but there is rich reward for those who stay the course and navigate the challenges around the physical management and security of the data – data that represents the unique value-add of IoT to both businesses and consumers alike.
Marc Lucas, Systems Engineer, Commvault