
Five edge computing challenges enterprises face and how to overcome them

(Image credit: Flex)

The amount of data created, captured, copied, and consumed has increased significantly as video - from CCTV to autonomous vehicles and unmanned aerial vehicles - has become central to edge operations. With so much data being produced, and no practical way to backhaul all of it, compute is pushed to the edge of the network to pre-process it. Problems arise, however, when numerous devices transmit data at the same time: sending the abundance of device-generated data to a centralized data center or the cloud can create bandwidth and latency issues. For a digital transformation program to succeed, it is key to be able to harness the power of data from anywhere. This has driven the rise of a more efficient alternative - edge computing. 

Edge computing, as defined by Gartner, is ‘a part of a distributed computing topology in which information processing is located close to the edge, where things and people produce or consume that information.’ The idea behind this model is to process data with the lower latency that many new applications require, while saving network cost. Edge computing can halt the transmission of irrelevant data to the cloud or data centers and send only the relevant, actionable data. It brings computation and data storage closer to the devices where data is gathered, instead of relying on a central location that can be hundreds or thousands of miles away. By 2025, close to 75 percent of enterprise data is expected to be handled by edge computing nodes. 
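The filter-at-the-edge idea described above can be sketched in a few lines. The following is a minimal, hypothetical Python example, not a real product API: an edge node keeps only readings that cross an alert threshold and forwards those upstream, discarding the rest. The threshold value and the record format are illustrative assumptions.

```python
# Minimal sketch: pre-process sensor readings at the edge and forward
# only actionable events, rather than backhauling every raw sample.
# The threshold and record format are illustrative assumptions.

ALERT_THRESHOLD = 75.0  # e.g. a temperature alarm level, in degrees C

def filter_readings(raw_readings):
    """Keep only the readings worth sending upstream."""
    return [r for r in raw_readings if r["value"] >= ALERT_THRESHOLD]

raw = [
    {"sensor": "cam-01", "value": 20.5},
    {"sensor": "cam-02", "value": 81.2},
    {"sensor": "cam-03", "value": 19.9},
    {"sensor": "cam-04", "value": 90.1},
]

actionable = filter_readings(raw)
print(f"kept {len(actionable)} of {len(raw)} readings for upstream transfer")
```

Only the two above-threshold readings leave the site; the raw stream never touches the backhaul link.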

However, despite IoT and its networking potential emerging over a decade ago, organizations still face various obstacles on the way to full-scale adoption. The challenge enterprises face with intelligence gathering is a layer of complexity that causes many programs to fail early on. 

Five key edge computing challenges  

Today’s edge architecture often comprises disparate compute systems, connectivity, and storage, which is a logistical nightmare: deployment requires experienced IT staff every time a site or device is connected. Ultimately, this leads to escalating costs, delays, and, unfortunately, the closure of many edge projects. 

Outlined below are five specific challenges many enterprises face when it comes to deploying edge intelligence: 

- Inefficient use of bandwidth: An organization whose devices collectively produce a lot of data will likely want to store that data in the cloud, but sending raw data to the cloud directly from edge devices is difficult. Businesses typically grant higher bandwidth to data centers and lower bandwidth to endpoints, yet edge computing is driving the need for more bandwidth across the entire network. 

- Speed bottlenecks: Connectivity networks such as 5G, DSL, and satellite prioritize throughput from cloud to edge, since most applications consume data that way, whereas edge deployments need to push data in the other direction. As a result, uplink speed can become a bottleneck. And if data is stored in a centralized cloud and that cloud goes down, the data is out of reach until the outage is resolved, with potential loss of business.

- Limited data controls: Because edge computing processes and analyzes only partial sets of information, many enterprises risk losing out on valuable insights.

- Ineffective security/privacy: 66 percent of IT teams see edge computing as a threat to their organizations. Data handled across many different devices may not be as secure as in a centralized or cloud-based system, so it is crucial to understand the potential security vulnerabilities around these devices and to ensure the systems can be fully secured.

- No container/microservice support: Unlike a traditional cloud container, edge containers can be deployed in parallel to geographically diverse points of presence (PoPs), but with many containers spread across many regions, careful planning and monitoring are required. 
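The bandwidth challenge in the first bullet is easy to quantify with a back-of-envelope calculation. The figures below (camera count, per-stream bitrate, uplink capacity, and the fraction of data kept after edge filtering) are illustrative assumptions, not measurements from the article:

```python
# Back-of-envelope sketch of the bandwidth problem: many cameras
# streaming raw video quickly exceed a site's uplink capacity.
# All figures are illustrative assumptions.

cameras = 50
mbps_per_camera = 4.0   # assumed compressed CCTV stream bitrate
uplink_mbps = 100.0     # assumed site uplink capacity

raw_demand = cameras * mbps_per_camera  # raw backhaul demand in Mb/s
print(f"raw demand: {raw_demand} Mb/s vs uplink: {uplink_mbps} Mb/s")

# If edge pre-processing forwards only, say, 5 percent of the data
# as actionable events, demand drops well under the uplink limit.
event_fraction = 0.05
edge_demand = raw_demand * event_fraction
print(f"after edge filtering: {edge_demand} Mb/s")
```

With these assumed numbers, raw backhaul would need twice the available uplink, while edge filtering leaves ample headroom.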

Being open to open architecture 

The most efficient edge solution integrates all three components - compute, connectivity, and storage - in a single box. Current edge solutions focus on specific application use cases, which is ideal when no other system is in place. For edge to become the new standard, it needs the flexibility to support new systems while encompassing legacy ones.

The best way to overcome these hurdles is to employ an open architecture platform that reduces technology sprawl, potential security weaknesses and exposures, and costs. Minimizing system architecture complexity is key. The fundamentals of an open edge architecture are modularity and openness: the ability to connect to any network or communication device interface, such as cellular, Wi-Fi, LoRa, or GPS, and the ability to run multiple unique software stacks as a homogeneous entity, including firewall, machine learning, telemetry, and data analysis. 

The key functionalities of an open architecture that help mitigate the five edge deployment challenges above include:

- Virtualization: Remove the need for disparate compute resources. A single compute platform can be shared and segmented to run all applications securely, both new and legacy. Supporting legacy applications in particular saves development time and money and provides uniformity across networks, so virtualization is key. 

- Ruggedization: Choose a hardware partner that offers the necessary compliance and security certifications, and can perform well in any environment.

- Multi-Carrier Support: Minimize exposure to loss of connectivity by working with multiple bearer types, such as multiple public and private mobile network operators, Wi-Fi providers, and satellite providers.

- Private Connectivity: Implement an overlay network. Irrespective of the backhaul provider, a secure path to cloud and SaaS applications is always available.

- Principle of Least Privilege: Implement security protocols that segment the data, control, and administration planes. For example, a data scientist has access to machine intelligence, IT can manage security, and the CIO can manage the edge node.
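The plane segmentation in the last bullet can be sketched as a simple role-to-plane permission map. This is a minimal illustration of the least-privilege idea, not a real product's access-control API; the role and plane names are assumptions drawn from the example above.

```python
# Minimal sketch of plane-segmented, least-privilege access control.
# Role and plane names are illustrative assumptions, not a product API.

PERMISSIONS = {
    "data_scientist": {"data"},              # machine-intelligence output only
    "it_admin":       {"control"},           # security and network policy
    "cio":            {"control", "admin"},  # edge-node administration
}

def can_access(role, plane):
    """Return True if the role is allowed to touch the given plane."""
    return plane in PERMISSIONS.get(role, set())

for role in sorted(PERMISSIONS):
    print(role, "->", sorted(PERMISSIONS[role]))
```

The point of the design is that no single role holds every plane: compromising the data-science credentials, for instance, grants no path to node administration.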

Simplifying edge roll-outs 

Edge has the ability to unlock quicker, less costly, and more secure operations across industries, notably autonomous vehicles. However, as outlined above, rolling out the intelligent edge can be challenging. That is why adopting an open architecture approach is so critical. With an open architecture, it is possible to demonstrate the ease with which the edge can be rolled out securely, efficiently, and effectively to advance innovation across a range of industries.

Frank Murray, Chief Technology Officer, Klas

Frank Murray is Chief Technology Officer of Klas. Since joining the company in 2001, Frank has grown the R&D department from a startup to a multi-functional team with hardware, software and mechanical capabilities of the highest standard.