Managing data and network complexity in the year ahead

The past ten years have seen a large portion of enterprise IT budgets directed towards attempts at collecting, securing and analysing huge volumes of big data.

More connected devices are being added to the arsenal of gadgets we carry with us – just consider how many smartphones, tablets, smartwatches and fitness trackers were unwrapped over Christmas.

In addition, efforts to analyse data in real time, and the growing Internet of Things (IoT) trend, will require businesses to focus more on operational technology (OT) than on traditional information technology (IT).

As a result, the increasing complexity of networks, and the sheer number of connected devices constantly streaming data over those networks, will lead to a paradigm shift this year in how businesses secure and manage their systems and information.

Reducing CAPEX and OPEX with SDN

When you consider that more employees now work in remote locations than in the main office of a typical business, it’s not hard to understand why the network has become more complex and difficult to manage than ever.

Fortunately though, for users and IT teams alike, Moore’s Law is commoditising the dedicated switching hardware space.

Traditionally, building a network has required a business to buy a lot of hardware and then over-provision by installing more bandwidth than is needed, leading to an explosion in OPEX (operating expenditure).

However, a new architecture known as software-defined networking, or SDN, is beginning to gain real-world traction. SDN replaces the system of routers and switches spread out across an organisation’s network with a hyper-converged infrastructure that projects apps and data from the data centre.

SDN represents a transformation from costly dedicated hardware systems to inexpensive computers and mobile devices all featuring low-cost processors.
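To make the idea concrete, here is a minimal, hypothetical Python sketch of the principle described above: a central software controller holds the network view and pushes forwarding rules out to commodity switches, instead of each dedicated box being configured on its own. The classes and rule format are illustrative assumptions, not any vendor’s API; real controllers speak southbound protocols such as OpenFlow.

```python
# Toy illustration of the SDN idea: a central software controller computes
# forwarding rules and pushes them to commodity switches, so the intelligence
# lives in software rather than in dedicated hardware. All names here are
# hypothetical assumptions for the sketch.

class Switch:
    """A commodity switch: it only stores and applies flow rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}          # destination subnet -> output port

    def install_rule(self, dst_subnet, out_port):
        self.flow_table[dst_subnet] = out_port

    def forward(self, dst_subnet):
        return self.flow_table.get(dst_subnet, "drop")


class Controller:
    """Central control plane: holds the network view and programs switches."""
    def __init__(self):
        self.switches = []

    def register(self, switch):
        self.switches.append(switch)

    def push_policy(self, policy):
        # The policy is decided once, in software, then pushed everywhere,
        # instead of being configured box-by-box on dedicated hardware.
        for switch in self.switches:
            for dst_subnet, out_port in policy.items():
                switch.install_rule(dst_subnet, out_port)


controller = Controller()
edge = Switch("edge-1")
controller.register(edge)
controller.push_policy({"10.0.1.0/24": 2, "10.0.2.0/24": 3})
print(edge.forward("10.0.1.0/24"))   # -> 2
```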

Without the need for the same amount of dedicated hardware, businesses will soon see a dramatic reduction in CAPEX (capital expenditure) and OPEX.

Indeed, we’ll see more compute companies trying to move into Cisco’s space this year to meet enterprise goals of OPEX and CAPEX reduction.

Big data is coming down the pipes

Businesses today are faced with the new challenge of turning the huge piles of big data they’ve amassed into actionable intelligence, a task made even more difficult when the data keeps on coming.

The analysis of big data requires looking at historical data and analysing patterns over long periods of time. Typically, this process has all been about trying to examine volumes of information for nuggets that can be used to improve business processes, leading to conversations around storage metrics such as how many petabytes or exabytes a company is trying to manage.

This year, however, while analysing big data will remain a priority, the focus of these conversations will shift from Hadoop and the amount of data stored, to examining multiple pipelines of data streaming in from specific sources, particularly as the IoT becomes more disruptive for businesses.

Sensors on connected devices will pinpoint where data is coming from and who is using these devices, in a similar way to the real-time web analytics used by marketers. Smart appliances, for example, such as thermostats and water heaters, can transmit data back to their manufacturers on how they are being used by homeowners. A smart car can send data to its maker about its operation, location and environment, enabling the manufacturer to push out software updates that will enhance the car’s performance and the driver’s experience, and even avoid problems before they occur.

Businesses will focus more on analysing these data pipelines, which provide real-time information directly from the source, before the data is moved into big data lakes.

Once in the lake, the data will be used to identify long-term trends that can improve sales, enhance customer experience, and grow the business.
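As a rough illustration of that shift, the hypothetical Python sketch below inspects each sensor reading as it arrives, acting on it in real time and keeping running per-device aggregates, while still batching the raw records towards longer-term “data lake” storage. The device names, threshold and in-memory batch are assumptions made for the example, not a specific product’s pipeline.

```python
# Minimal sketch of stream-first analysis: inspect each sensor reading as it
# arrives (real-time path), then batch the raw records towards longer-term
# storage (the "data lake") for historical trend analysis. The readings,
# threshold and in-memory lake below are illustrative assumptions only.

from collections import defaultdict

ALERT_THRESHOLD_C = 80.0            # hypothetical overheating threshold

def process_stream(readings):
    running_totals = defaultdict(lambda: {"sum": 0.0, "count": 0})
    lake_batch = []                 # stand-in for a real data lake sink

    for device_id, temp in readings:   # each reading: (device_id, temperature_c)
        # Real-time path: act on the data the moment it arrives.
        if temp > ALERT_THRESHOLD_C:
            print(f"ALERT {device_id}: {temp} C exceeds threshold")

        totals = running_totals[device_id]
        totals["sum"] += temp
        totals["count"] += 1

        # Batch path: the raw record is retained for long-term trend analysis.
        lake_batch.append((device_id, temp))

    averages = {d: t["sum"] / t["count"] for d, t in running_totals.items()}
    return averages, lake_batch


stream = [("thermostat-1", 21.5), ("heater-7", 85.2), ("thermostat-1", 22.0)]
averages, lake = process_stream(stream)
print(averages)                     # per-device averages from the live stream
```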

The IoT means OT over IT

Gartner defines operational technology, or OT, as “hardware and software that detects or causes a change through the direct monitoring and/or control of physical devices, processes and events in the enterprise.”

As more and more connected machines communicate and share data with each other, people will begin to be left out of the process entirely. We’ll start to see the priorities of IT departments shift away from people and move toward the growing number of IoT devices, placing more emphasis on OT than on IT.

For technology professionals, this will bring a new level of accountability, scale and technical challenge.

As enterprises implement a number of IoT machines and automation technologies to make their businesses hyper-efficient without having to add more people, there will be infrastructure issues that go far beyond email going down or the Internet being inaccessible.

Whereas an IT environment requires someone to monitor the status and operation of all machines to ensure they’re operating at peak efficiency, machines in an OT environment will monitor themselves, taking proactive measures to prevent problems, and telling people how to fix them.
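A minimal, hypothetical sketch of that self-monitoring behaviour: the device evaluates its own telemetry, takes the proactive measures it can, and tells the operator exactly what to fix when it cannot. The telemetry fields and thresholds are illustrative assumptions only.

```python
# Toy sketch of a self-monitoring OT device: it checks its own telemetry,
# acts proactively where it can, and reports a concrete remedy when it needs
# a person. Device fields and thresholds are illustrative assumptions.

def self_check(telemetry):
    actions, advisories = [], []

    if telemetry["fan_rpm"] < 1000:
        # A proactive measure the device can take on its own.
        actions.append("switched to backup fan")

    if telemetry["firmware_age_days"] > 180:
        # Something only a person can resolve: say exactly what to do.
        advisories.append("schedule firmware update within 7 days")

    if telemetry["temperature_c"] > 75:
        actions.append("throttled workload to reduce heat")

    return {"device": telemetry["device_id"],
            "actions_taken": actions,
            "operator_advisories": advisories}


report = self_check({"device_id": "pump-42",
                     "fan_rpm": 800,
                     "firmware_age_days": 200,
                     "temperature_c": 70})
print(report)
```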

Rather than thousands of employees, an OT professional will instead service millions, perhaps even billions of devices. As OT, not IT, starts to become the centre of attention, we’ll start to see the true “rise of the machines.”

Stu Bailey, founder and chief scientist, Infoblox

Image Credit: Shutterstock/hywards