With 2020 on the horizon, it’s clear that digital transformation has now outgrown its ‘buzzword’ status and is a fact of life for many businesses. Reports suggest that over 90 per cent of organisations in the US and UK are planning or currently undertaking these projects to gain a competitive advantage. However, the mundane reality is that many firms still aren’t getting much more than incremental improvements.
One innovation set to shake things up and open the door to a whole new world of innovation-fuelled growth is edge computing. Yet many firms are held back by complexity, skills shortages and legacy database technology. The answer may lie with edge-ready database tools that support rapidly evolving developer demands, and a return to prominence for IT architects, who are increasingly the key to making edge computing a reality.
Living on the edge
At the heart of digital transformation is the ability to give end users and customers unique experiences that help foster loyalty and ultimately drive operational efficiencies and profits. It couldn’t happen without cloud computing and the low-cost, on-demand, highly scalable compute power that it provides to developer teams. However, there’s a problem: digital transformation has become its own worst enemy. As the number of websites, applications, IoT devices and online services grows, so does the volume of data. According to Cisco, global datacentre traffic will triple from 2017 to 2021, to reach nearly 21 zettabytes annually.
The problem with all this data is that it’s clogging up the pipes that carry it to and from the cloud and physical datacentres. Organisations want access to more and more data to uncover customer insight, and want to present their users with more and more data to improve their mission-critical and revenue-generating services and maintain a competitive edge. However, passing this ever-increasing amount of data back and forth between the edge and the core results in skyrocketing bandwidth costs and greater exposure to latency and network outages. On top of all that are the security and compliance challenges of storing and transporting the data. This is where edge computing comes in.
Edge computing is a distributed computing model in which data processing and storage are carried out on the periphery of the network, closer to where they’re actually needed. This minimises the need to send data back and forth to a centralised server or cloud, reducing bandwidth usage and latency. Crucially, it allows for much faster decision making than a more traditional centralised computing model can offer, which makes all the difference when it’s applied to something time-sensitive – self-driving cars, for instance. If a self-driving car relied on cloud computing alone, it would need to send data up to the cloud or server and wait for the decision to be sent back down, regardless of whether the data was time-sensitive or not. With edge computing, the car can act on urgent items locally: if a hazard is detected, it can immediately decide to stop without waiting for data or instructions to come back from a central server.
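The split described above – act locally on time-sensitive events, defer everything else to the cloud – can be illustrated with a minimal sketch. This is an assumption-laden toy, not any vendor's actual implementation; the event names and the `handle_event` function are hypothetical.

```python
import queue

# Hypothetical set of events that must be handled at the edge immediately.
URGENT_EVENTS = {"obstacle_detected", "brake_failure"}

def handle_event(event: str, cloud_queue: "queue.Queue[str]") -> str:
    """Decide locally for time-sensitive events; defer the rest to the cloud."""
    if event in URGENT_EVENTS:
        # Act immediately at the edge -- no round trip to a central server.
        return "stop"
    # Non-urgent telemetry is queued for later upload and central analysis.
    cloud_queue.put(event)
    return "defer_to_cloud"
```

The design point is simply that the latency-critical branch never touches the network; only the non-urgent branch pays the bandwidth and latency cost of the cloud round trip.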
The benefits of edge don’t stop there. Take Ryanair’s experience, for example. As one of the world’s biggest airlines, it handles the data demands of more than three million mobile users via its app. After implementing edge-enabled database technology, it cut network bandwidth to and from its cloud provider by as much as 80 per cent. And since it paid for each byte transferred to and from the cloud, this amounted to a massive cut in operating expenses.
Edge also helps businesses avoid the threat of IT outages. This type of disruption can be particularly severe – Gartner estimates that a service outage can cost up to $5,600 per minute. Centralised computing networks may not be to blame when an outage occurs, but they make addressing them more difficult. If everything on the network relies on the ‘core’ to function, any downtime will be felt across the board. With edge computing, the outage is usually limited to the device in question, while the rest of the network continues uninterrupted. It’s evidence like this that led IDC to predict back in 2016 that by this year at least 40 per cent of IoT-created data would be stored, processed, analysed and acted upon close to or at the network edge.
The truth is that IT systems have undergone massive architectural change over the past decade as the emergence of cloud computing centralised compute, data and storage capabilities. As infrastructure also moved to the cloud, many saw this as the death knell for the traditional IT architect. Well, now they’re very much back in demand, as edge computing forces a rethink of the old architectural assumptions around cloud computing. Data is on the move again, from the centre to the edge, requiring an accompanying shift in infrastructure and the skills to understand how to manage this evolution.
Edge computing really comes into its own when you start to look at some of the companies already using it to drive value in a variety of use cases – from retail to healthcare.
One such company is SyncThink, a neuro-technology firm that uses eye-tracking metrics and devices to help improve medical assessments of traumatic brain injuries. The firm required an offline mode for environments like sporting stadiums, where doctors sometimes need to conduct urgent assessments of athletes but bandwidth is often patchy because of heavy mobile usage by fans. Edge computing, supported by a mobile-ready NoSQL database, allowed the firm to offer offline capabilities and then seamlessly sync with the Azure cloud when sufficient bandwidth became available.
Also in healthcare, medical tech firm Becton Dickinson tapped the power of edge computing to optimise treatment for Type-2 diabetes sufferers. Medical devices and a patient app automatically collect real-time data on a patient’s insulin and glucose levels, activities, meals, and location, and then provide them with customised alerts and recommendations. Once again, the value of edge is in offering offline capabilities, to ensure the consistency of collected data, with secure synchronisation offered once connectivity is available again.
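The offline-first pattern behind both of these healthcare examples can be sketched in a few lines: record everything locally first, then push any queued data once connectivity returns. This is a minimal illustration under stated assumptions, not the actual SyncThink or Becton Dickinson implementation; `is_online` and `upload` are hypothetical stand-ins for a real connectivity check and a database sync endpoint.

```python
import time
from typing import Callable, Dict, List

class OfflineBuffer:
    """Minimal offline-first store: keep readings locally, sync when online."""

    def __init__(self, is_online: Callable[[], bool],
                 upload: Callable[[List[Dict]], None]) -> None:
        self._pending: List[Dict] = []
        self._is_online = is_online
        self._upload = upload

    def record(self, reading: Dict) -> None:
        # Always persist locally first, so no data is lost while offline.
        reading["ts"] = time.time()
        self._pending.append(reading)
        self.try_sync()

    def try_sync(self) -> int:
        # Push any queued readings once connectivity is available again.
        if not self._is_online() or not self._pending:
            return 0
        batch, self._pending = self._pending, []
        self._upload(batch)
        return len(batch)
```

The key design choice is that the local write always succeeds regardless of network state, which is what keeps collected data consistent; synchronisation is opportunistic rather than a precondition for capture.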
Another standout example is UK delivery service Doddle. More than 80 of its locations around the country suffered from patchy mobile coverage at peak times when customers saturated the network. The answer was an edge computing set-up to ensure its customers and employees always have access to its app-based services.
New complexity, new architecture
Yet with new opportunities come new complexities for organisations. Although nearly 15 per cent of European IT leaders in 2018 claimed to be already using edge computing, an even bigger share (21 per cent) admitted that it would take them more than five years to do so. Alongside the complexity of using multiple technologies (43 per cent), respondents cited reliance on legacy database technology (37 per cent), a lack of resources (36 per cent), and a lack of skills (33 per cent) as key barriers to adopting new digital services.
It’s no exaggeration to say that IT architects have a critical role today, sitting between the C-suite and development teams to unlock value from edge computing and help everyone to realise their ambitions.
Perry Krug, Architect, Office of the CTO, Couchbase