Eyeballs at the edge: The future of video analytics

During my career I’ve been fortunate enough to speak with leading experts from a range of tech industries. Whether they develop cloud applications, work to make smart cities a reality or design semiconductors, the pressing question on their minds is which new technologies will play a significant role in the future. For all of these tech innovators my answer is always the same: eyeballs at the edge.

While this may immediately conjure images of the growing number of video cameras populating our buildings, campuses and cities, it does not, in fact, refer to physical security. Instead, eyeballs at the edge simply refers to the use of video analytics in combination with edge computing. The name comes from the fact that, across its many potential use cases, the technology performs relatively simple image recognition tasks on a video stream source; in other words, tasks that could be performed by the human eye. What makes this technology so exciting, however, is that the underlying infrastructure enabling these capabilities is currently at an inflection point in price-performance, size, throughput and power consumption. Workloads that were once CPU-intensive, and that later demanded high-power specialty GPUs, can now run on chips that are far more efficient, cheaper, smaller and less resource-hungry. Importantly, these chips can be easily attached over PCI and USB connections.
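To make the idea concrete, here is a minimal sketch of a single “eyeball at the edge”: a deliberately simple person detector running locally against a video stream, so that only small results, not raw footage, ever need to leave the device. It assumes OpenCV is installed and a camera is reachable at index 0; the built-in HOG detector stands in for whatever accelerator-backed model a real deployment would use.

```python
# Minimal sketch: simple image recognition on a local video stream.
# Assumes OpenCV (pip install opencv-python) and a camera at index 0;
# an RTSP URL could be used instead for a networked camera.
import cv2

# OpenCV's built-in HOG-based person detector, a deliberately simple "eyeball".
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Inference happens entirely on the edge device; only the compact
    # result (a count or bounding boxes) would ever be sent upstream.
    boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    print(f"people detected in this frame: {len(boxes)}")
cap.release()
```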

Keeping your eye on the prize

To illustrate what these developments in chips and infrastructure mean in practice, let’s look at how eyeballs at the edge could be implemented in healthcare. For an industry that is constantly under time pressure, it seems inefficient that medical staff spend copious amounts of time simply locating portable equipment. With eyeballs at the edge, hospitals could easily keep track of mobile X-ray machines, ultrasounds or infusion pumps as they move around the building. Medical organisations have already invested in various wireless technologies to address this problem but, while useful, these frequently fail to offer the precision and low cost that institutions require.

Ultra-wideband is another technology with potential here, but it is eyeballs at the edge that holds the most promise. With simple video analytics, eyeballs at the edge can be taught to identify medical devices and track them as they are transported around a hospital, just as a human observer would. The edge component is also fundamental, allowing for near-real-time asset location far faster than typical cloud capabilities could provide. Moreover, in a typical cloud set-up, the cost of simply transferring the data generated by dozens of video cameras to a distant server would be prohibitive.

In practice, this enables a hospital to track its ultrasound machine as it is moved several doors down the hall without copious staff hours or cost. From this example, it is clear how edge computing is integral to making video analytics increasingly accessible and affordable for organisations in healthcare and beyond.
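To sketch how that could work in software, the snippet below keeps a simple “last seen” registry mapping each recognised device to the location of the camera that most recently spotted it. It assumes a hypothetical per-camera detector feeding it labels such as “ultrasound”; the camera names and locations are illustrative.

```python
# Minimal sketch of a "last seen" asset registry fed by edge cameras.
# The detection step is assumed to happen on each camera's edge node;
# only small events (camera id plus recognised labels) reach this registry.
from datetime import datetime

# Which room or corridor each camera covers (illustrative values).
CAMERA_LOCATIONS = {
    "cam-07": "Ward 3",
    "cam-12": "Radiology corridor",
}

# asset label -> (last known location, time it was last seen)
last_seen: dict[str, tuple[str, datetime]] = {}

def update_registry(camera_id: str, detected_labels: list[str]) -> None:
    """Record where and when each recognised device was last spotted."""
    location = CAMERA_LOCATIONS[camera_id]
    timestamp = datetime.now()
    for label in detected_labels:
        last_seen[label] = (location, timestamp)

# Example: cam-12's local analytics reports an ultrasound trolley passing by.
update_registry("cam-12", ["ultrasound"])
print(last_seen["ultrasound"])  # ('Radiology corridor', <timestamp>)
```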

In the public eye

One area where we are already seeing eyeballs at the edge in use is stadiums. Basic video analytics are already used to keep track of bathroom queues, flag when wait times are becoming excessive and, in turn, activate relevant signage. However, there is also scope in stadiums to deploy more complex systems. For instance, by tracking customers as they move around the stadium’s shops, staff can tell whether a customer is interested, just passing through or in need of assistance. These sorts of rapid, real-time insights into customer behaviour are only possible because of the phenomenal rate at which video analytics algorithms are currently developing. However, just as edge computing is necessary to make video analytics practical for hospitals, it is also necessary here to avoid the cost of processing this video data in the cloud.
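As a rough illustration of the queue example, the sketch below flips signage once a headcount crosses a threshold. Both count_people_in_queue and update_signage are hypothetical placeholders for the camera analytics and signage integration a venue would actually supply, and the threshold is an assumption.

```python
# Minimal sketch of edge-side queue monitoring driving digital signage.
# Both helper functions are placeholders standing in for real integrations.
import random

QUEUE_THRESHOLD = 15  # assumed headcount at which waits start to feel excessive

def count_people_in_queue(camera_id: str) -> int:
    """Placeholder for the local video analytics that counts people in view."""
    return random.randint(0, 30)

def update_signage(camera_id: str, message: str) -> None:
    """Placeholder for the venue's digital signage integration."""
    print(f"[{camera_id}] signage -> {message or '(cleared)'}")

def check_queue(camera_id: str) -> None:
    # The decision is made on the edge node next to the camera, so only this
    # tiny result, never the raw video, has to travel anywhere.
    if count_people_in_queue(camera_id) > QUEUE_THRESHOLD:
        update_signage(camera_id, "Long wait here: shorter queues near Gate B")
    else:
        update_signage(camera_id, "")

check_queue("cam-bathroom-3")
```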

With video analytics now capable of telling the difference between a group of friends walking home at night and two people having a fight in a crowd, the number of potential use cases extends far beyond the stadium or hospital. The power of the algorithmic age is a concern for some, who find it hard to escape the idea of ‘Big Brother’. As the underlying technologies develop, however, so does their promise to improve quality of life.

This trend is largely being driven, as with any inflection point, by a step-function change in the capabilities of the technology. In this instance, the proliferation of custom video AI accelerator chips from Intel (such as its Movidius chips), Google (such as its Edge TPU) and Mythic (with its mixed digital and analogue approach) has been instrumental. Beyond their video analytics capabilities, these chips allow for a smaller form factor and far better price, performance and power consumption, all of which can be delivered in an edge computing server.

Given the rate at which these developments are unfolding, the number of potential use cases is set to grow rapidly, and we are likely to see ever more eyeballs at the edge in our daily lives. Whether in your nearest hospital, stadium or retail outlet, enhanced video analytics are something to keep an eye out for.

Eric Broockman, Chief Technology Officer, Extreme Networks