Broadband’s identity crisis: The QoE challenge

In most industries, customers can choose what they want to buy using a rating system to guide their expectations. Hotels have stars, flights have seating classes, and restaurants rely on critic reviews. Yet, surprisingly, the same cannot currently be said for broadband, even though, in today’s online-centric world, such a rating has arguably become more important to the average consumer than all the others combined.

Whether it’s fixed or mobile, download speed has been the core metric service providers have used to market themselves for decades, and it’s easy to see why. High speed means high quality, right? Not always. Although there’s a place for speed as a leading indicator of a broadband connection’s performance, the overall Quality of Experience (QoE) is arguably just as important, if not more so. Yet QoE can be subjective, not least because the Internet is used for a wide variety of purposes: everything from Video on Demand to online gaming, and from VoIP to general web browsing.

Although a base level of performance is expected across the board, different subscribers will place greater emphasis on different things. A regular gamer will expect low latency, for example, while a heavy streamer will want a consistent data throughput to avoid degradation of the video they’re watching. This presents service providers with a challenge, especially as a single metric – like download speed – cannot possibly give all these different user groups an accurate expectation of network performance.

The modern definition of QoE

Rising traffic levels have also put operators under pressure, and tackling this problem depends on a number of different factors. First and foremost, service providers need end-to-end visibility into any new services and content platforms likely to cause a surge in demand on their networks, since poor network performance, in the subscriber’s eyes, is widely accepted to drive churn. It’s just as important to know when and where a surge is likely to occur. Only with this level of actionable data will providers be able to meet the needs of today’s subscribers en masse rather than just serving a small subset of their overall user base.

Network intelligence and traffic management tools, supported by Deep Packet Inspection (DPI), are therefore vital if service providers are to reduce churn and, in turn, better target different user sets. These tools not only help meet that goal; they also unlock the data needed to make it a reality. After all, it’s only with DPI that service providers can identify what traffic is flowing across their networks in real time, prioritise traffic where appropriate, and pinpoint where the network is congested so they can take steps to address it.
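To make the idea concrete, here is a minimal, hypothetical sketch in Python: flows are labelled with an application category from simple metadata and then ordered by priority. The port ranges, SNI strings, and categories are invented for illustration only and bear no relation to any particular vendor’s DPI engine, which would inspect traffic far more deeply.

```python
# Illustrative sketch only: a toy flow classifier and priority map, not a real
# DPI engine. Real DPI inspects payloads and signatures; here we classify on
# hypothetical flow metadata simply to show the prioritisation idea.
from dataclasses import dataclass

@dataclass
class Flow:
    src_ip: str
    dst_port: int
    sni: str            # TLS Server Name Indication, if observed
    bytes_per_sec: float

# Hypothetical application categories and their forwarding priority (lower = sooner).
PRIORITY = {"voip": 0, "gaming": 1, "video": 2, "browsing": 3, "bulk": 4}

def classify(flow: Flow) -> str:
    """Very rough classification from flow metadata (assumed heuristics)."""
    if flow.dst_port in (5060, 5061) or "voip" in flow.sni:
        return "voip"
    if "rocketleague" in flow.sni or flow.dst_port in range(7000, 9000):
        return "gaming"
    if "netflix" in flow.sni or "youtube" in flow.sni:
        return "video"
    if flow.bytes_per_sec > 1_000_000:
        return "bulk"
    return "browsing"

def schedule(flows: list[Flow]) -> list[tuple[str, Flow]]:
    """Order flows so that latency-sensitive traffic is served first on a congested link."""
    labelled = [(classify(f), f) for f in flows]
    return sorted(labelled, key=lambda pair: PRIORITY[pair[0]])

if __name__ == "__main__":
    flows = [
        Flow("10.0.0.5", 443, "youtube.com", 4_000_000),
        Flow("10.0.0.7", 7801, "rocketleague.example.net", 30_000),
        Flow("10.0.0.9", 5061, "voip.example.net", 64_000),
    ]
    for label, f in schedule(flows):
        print(label, f.src_ip)
```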

Scoring network performance

By putting a range of processes in place to better manage and monitor data traffic in this way, service providers can identify what the network is capable of delivering at any given time. In today’s hyper-connected environment, fuelled by data-led services, operators need a more granular view – one that offers insight into the service, the application, and the subscriber at any given moment – and that can be achieved by rating performance against the QoE of each service. This ‘scorecard’ system helps operators drill down to the root cause of service degradation and poor QoE through analysis of network data. Once collected, structured, and assessed, that data – traditionally used only to inform the running of the network – can then be used by operators to score their own services, providing transparency and setting expectations for their subscribers.
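As a rough illustration of what such a scorecard might look like under the hood, the Python sketch below turns measured latency, throughput, and packet loss into a 0–100 score per service class and then into a letter grade. The service classes, weightings, and thresholds are assumptions made up for this example; they are not an industry standard or any operator’s actual methodology.

```python
# Minimal sketch of a per-service QoE 'scorecard'. All targets and weights
# below are illustrative assumptions, not real benchmarks.

# Hypothetical targets per service class: (max latency ms, min throughput Mbps, max loss %)
TARGETS = {
    "gaming":   (40.0, 5.0, 0.5),
    "video":    (100.0, 25.0, 1.0),
    "voip":     (60.0, 0.5, 1.0),
    "browsing": (150.0, 10.0, 2.0),
}

def qoe_score(service: str, latency_ms: float, throughput_mbps: float, loss_pct: float) -> float:
    """Return a 0-100 score: how close measured performance is to the service's targets."""
    max_lat, min_tput, max_loss = TARGETS[service]
    lat_ok = min(max_lat / max(latency_ms, 1e-6), 1.0)
    tput_ok = min(throughput_mbps / min_tput, 1.0)
    loss_ok = min(max_loss / max(loss_pct, 1e-6), 1.0)
    return round(100 * (0.4 * lat_ok + 0.4 * tput_ok + 0.2 * loss_ok), 1)

def grade(score: float) -> str:
    """Translate a numeric score into the kind of letter rating a subscriber could compare."""
    for threshold, letter in ((90, "A"), (75, "B"), (60, "C")):
        if score >= threshold:
            return letter
    return "D"

# Example: one subscriber's measured gaming performance during peak hours.
score = qoe_score("gaming", latency_ms=28, throughput_mbps=12, loss_pct=0.2)
print(score, grade(score))   # 100.0 A
```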

Marketing teams can also use network scores to highlight areas where the service provider is particularly strong. Take online gaming, for example. Continuous service is required for a good experience, as an intermittent connection will wreak havoc and ruin a user’s ability to play. Rocket League is one such game: played primarily online, it matches players from around the world against each other. Its developers deemed latency such an important aspect of the online experience that they built an in-game performance monitoring and alert system, designed to help players choose the server location that will give them the lowest latency and thereby improve their experience.
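The sketch below captures the spirit of that approach in a few lines of Python: measure round-trip times to each candidate region and pick the lowest. The region names and ping figures are invented for illustration, and Rocket League’s actual implementation is, of course, more sophisticated.

```python
# Toy version of latency-driven server selection; regions and pings are made up.
import statistics

def typical_ping(samples: list[float]) -> float:
    """Summarise round-trip-time samples (ms) for a region; the median resists outliers."""
    return statistics.median(samples)

def pick_region(ping_samples: dict[str, list[float]]) -> str:
    """Choose the region with the lowest typical latency."""
    return min(ping_samples, key=lambda region: typical_ping(ping_samples[region]))

pings = {
    "eu-west": [31, 29, 35, 30],
    "us-east": [92, 88, 95, 90],
    "asia":    [210, 205, 220, 215],
}
print(pick_region(pings))   # eu-west
```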

Realising the network’s potential

Service providers should treat Rocket League as a lesson. After all, if marketed in the right way, gamers may be willing to pay a premium for an ‘unlimited’, A-rated gaming broadband package. Paired with DPI technology, this could be delivered across the network by prioritising online gaming traffic for users on that specific package. And that is just one subset of a service provider’s overall user base.

Making all this a reality, however, depends on service providers choosing the right partner – one that can not only collect but also aggregate and structure network data so it can be transformed into actionable intelligence. Armed with that intelligence, they can make informed business decisions that improve QoE, better differentiate their offerings against the competition, and cater to the unique needs of individual user sets. It’s only by adopting this approach that they will be able to tackle the next network performance challenge, whenever it arises.

Cam Cullen, VP Global Marketing at Procera Networks

Image Credit: Shutterstock/asharkyu