There is a lot of fanfare around edge computing, and rightly so: Gartner estimates that within the next two years over 50 percent of enterprise data will be created and processed at the edge. When you break it down, though, edge computing is essentially a localized cloud infrastructure delivered to a specific location to support a specific use case. That puts businesses with a hybrid cloud strategy in pole position to begin moving towards the edge, or rather the ‘cloud edge’, which is the more accurate term. If that strategy happens to be based on an open framework, better still, because it gives them the freedom to select the best third-party resources available to underpin their cloud edge proposition.
Enterprise CIOs and CTOs have enough on their plates these days without being charged with adding yet another layer of sophistication to the IT stack to support cloud edge computing. Despite this, many businesses are now gearing up for the move. However, in the absence of a standard cloud edge computing model, large enterprises are looking for a helpful steer in the right direction.
Reaching the edge
Let’s begin by looking at which industries are expected to adopt cloud edge and why. Manufacturing, automotive, retail and healthcare are among the first movers. Having the ability to deliver applications and services with ultra-low latency is crucial to businesses with physical supply chains, production lines and the need to provision tens of thousands (eventually millions) of sensors and connected devices. Edge computing provides the mechanism to place data processing and compute power as close as possible to the end user. That has obvious applications for heavy industry and transport, but with the number of sensors and connected devices set to rise exponentially, cloud edge will soon become integral to every IT and cloud network.
The cloud edge isn’t a fixed space; it is sporadic in nature, ebbing and flowing with different use cases, scenarios and peaks in demand. As such it needs to be robust, scalable and smart. It also needs to be efficient. If the cloud edge is going to become a strategic asset, it needs to be integrated with an organization’s existing cloud infrastructure rather than delivered as a standalone solution.
This is one of the key reasons why businesses that are already managing multiple environments as part of a hybrid cloud strategy are well placed to succeed.
A distributed cloud architecture enables businesses to extend their capabilities out towards the cloud edge. They can do this on a gradual basis, testing the resilience and reliability of their new cloud edge function as they go. This is based on a unified model that allows businesses to build applications and run them anywhere across their distributed cloud infrastructure. Using a centralized system, developers and IT operations teams can launch new applications and then deploy them to any specific virtual or geographic location.
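The unified model described above can be sketched in a few lines. This is an illustrative sketch under assumed names (no specific vendor API is implied): a central control plane registers sites, virtual or geographic, and rolls the same container image out to every location matching a label selector.

```python
# Illustrative sketch of a unified deployment model: one control plane,
# many registered locations, label-based placement. All names here are
# assumptions for illustration, not a real product's API.
from dataclasses import dataclass, field


@dataclass
class Site:
    name: str
    labels: dict


@dataclass
class ControlPlane:
    sites: list = field(default_factory=list)

    def register(self, site):
        self.sites.append(site)

    def deploy(self, image, selector):
        """Deploy an image to every site whose labels match the selector."""
        targets = [s for s in self.sites
                   if all(s.labels.get(k) == v for k, v in selector.items())]
        for site in targets:
            # In a real system this step would schedule the container at the site.
            print(f"deploying {image} to {site.name}")
        return [s.name for s in targets]


plane = ControlPlane()
plane.register(Site("central-dc", {"tier": "core", "region": "eu"}))
plane.register(Site("factory-01", {"tier": "edge", "region": "eu"}))
plane.register(Site("store-42", {"tier": "edge", "region": "us"}))

# Roll the same application out only to European edge locations.
deployed = plane.deploy("sensor-gateway:1.0", {"tier": "edge", "region": "eu"})
```

The point of the sketch is the single deployment path: the same image and the same API, whether the target is a central data center or a factory floor.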
The container effect
This entire process is underpinned by containers. Without them it would be almost impossible to achieve the scale and reach associated with cloud edge computing. Containers are smaller and far more nimble than virtual machines: dozens of containers can fit in a space that could only accommodate one or two virtual machines. Containers also provide a common layer of abstraction. They can run anywhere across a distributed cloud infrastructure, moving between different points while delivering specific functionality at the cloud edge. This flexible model will allow businesses to scale their edge capabilities over time to manage larger numbers of sensors and devices, while accommodating new use cases.
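The density difference is easy to see with back-of-the-envelope arithmetic. The footprint figures below are illustrative assumptions rather than benchmarks: each VM carries a full guest OS, while containers share the host kernel and runtime layers.

```python
# Rough density comparison on a single edge node.
# Footprint numbers are assumptions for illustration, not measurements.
NODE_RAM_GB = 16

VM_FOOTPRINT_GB = 6.0          # guest OS + hypervisor overhead + app (assumed)
CONTAINER_FOOTPRINT_GB = 0.25  # app + shared runtime layers (assumed)

vms_per_node = int(NODE_RAM_GB // VM_FOOTPRINT_GB)
containers_per_node = int(NODE_RAM_GB // CONTAINER_FOOTPRINT_GB)

print(f"VMs per node:        {vms_per_node}")         # 2
print(f"containers per node: {containers_per_node}")  # 64
```

Under these assumptions a node that hosts two VMs can hold dozens of containers, which is what makes container-based architectures practical in constrained edge locations.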
A container-based architecture also facilitates an integrated and centralized management system that allows businesses to orchestrate, provision and secure applications. Having that unified model in place means that businesses will have complete visibility across their cloud edge properties. The cloud edge should be a natural extension of an enterprise IT and cloud infrastructure. It will be most effective if it’s based on the blueprint of an existing design, so it can share the protocols, systems, data and intelligence that are intrinsic to the entire network.
Insights at the edge
Further advantages include being able to integrate the cloud edge with central business systems and data analysis tools. In the case of provisioning and managing huge volumes of connections, development and IT operations teams will be able to run tests before rolling out container-based applications across a vast network of devices. The smarter the cloud edge the better, as it will allow teams to spot faults or abnormalities. Teams will want to minimize risks due to the mission critical nature of applications and services supported by the cloud edge.
It would be prudent for businesses to identify issues before they escalate. With an insights engine in place it will be possible to continuously collect and analyze data pulled in from across the distributed edge properties. Using AI and machine learning, that data will eventually feed predictive analytics that can register unusual activity and act upon it, notifying the correct channels.
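A minimal version of such an insights loop can be sketched as follows. This is an assumed design for illustration, not a specific product: keep a rolling window of telemetry from an edge site and flag readings that deviate sharply from the recent baseline, the kind of signal that would then be routed to the correct channels.

```python
# Minimal sketch of anomaly detection on edge telemetry (assumed design):
# flag any reading more than `threshold` standard deviations from the
# rolling baseline of recent readings.
from collections import deque
from statistics import mean, stdev


class AnomalyDetector:
    def __init__(self, window=20, threshold=3.0):
        self.window = deque(maxlen=window)  # rolling baseline of readings
        self.threshold = threshold

    def observe(self, value):
        """Return True if the value deviates sharply from the baseline."""
        anomalous = False
        if len(self.window) >= 5:  # need a few samples before judging
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomalous = True
        self.window.append(value)
        return anomalous


detector = AnomalyDetector()
# Steady sensor telemetry, then a sudden spike at an edge site.
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9, 55.0]
flags = [detector.observe(r) for r in readings]
print(flags)  # only the final spike is flagged
```

In production this simple statistical baseline would give way to trained models, but the loop is the same: collect, compare against learned behavior, and notify when something falls outside it.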
The open framework
From a cultural perspective, taking ownership of the migration to cloud edge helps businesses to develop and retain their IP. The developers, IT and network operations teams that are already immersed in hybrid cloud will be deployed to help the business realize its cloud edge ambitions. The lessons they learn along the way and the knowledge they gain will remain inside the business, providing it with the template and design for cloud edge. They will become trailblazers, and their work will attract other developers who may not have considered a career with a particular business until it started to demonstrate its cloud-native credentials.
Having an open framework for developing a cloud edge proposition brings several benefits, not least the ability to ensure a gradual and methodical transition to the edge. It also allows developers to participate in open source projects to accelerate innovation in what is still a nascent space. Crucially, it gives businesses the freedom to partner with any third party from across the IT stack that will help them to develop a fully rounded cloud edge solution. They can partner with the right communications service provider, to support any 5G capabilities, or connect with any number of cloud providers. They’re not tied to a specific vendor or technology so they can develop their own cloud edge roadmap.
Edging into the future
Cloud edge is here, but businesses have only begun to scratch the surface. To date most of the use cases that fall under the cloud edge category involve IoT solutions, but it has the potential to deliver far more: applications for financial services companies to support trading, as well as for VR, media and entertainment, and cloud gaming. Developers have an entirely new platform and environment to play with, able to develop fresh applications and services built on a fluid and constantly evolving infrastructure. It will support new business models, drive new categories and create new economies of scale. Businesses need to take note.
The path to the cloud edge is already set for those businesses with a hybrid cloud strategy. They’re already working with containers, which provide the ideal application architecture for the cloud edge. An open framework enables them to make the move on their own terms, selecting the partners from across the cloud and edge computing ecosystem that suit their technical and business requirements. It ensures a unified approach, so they can build a cloud edge that is firmly rooted in their own enterprise IT and cloud infrastructure.
Martin Percival, Solutions Architect Manager, Red Hat