
The impact of mobile: Is your network ready for the wireless working revolution?

Mobile technology is revolutionising the way we work. A few decades ago, computers were tied to desks, so we could only effectively work in the office. Then laptops allowed us to take our work home with us, or on trains. When wireless networking arrived, we could remain connected on the move – anywhere in the office, in coffee shops and hotels, and around the house. Now that smartphones and tablets have joined the wireless arsenal, however, WLANs are becoming crowded. Add to that the increasingly bandwidth-hungry applications we are running on our devices, and you have the recipe for a frustrating experience.

The most obvious bandwidth hog is video. The major video streaming sites now support HD, and a 720p stream will take close to 6Mb/s, whilst 1080p calls for at least twice that. Watching live TV online requires similar data throughput. Extensive consumption of video can therefore put stress on a wireless network, making video streams stutter and other applications sluggish. Heavy use of cloud services can also put a load on your WLAN, and a virtual private network (VPN) will use a varying amount of bandwidth: watching video on a VPN desktop requires much more data than relatively static office applications.
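As a back-of-envelope sketch, the per-stream figures above let you estimate how many simultaneous HD streams a given amount of real-world WLAN throughput will carry (the 50Mb/s figure below is an illustrative assumption, not a measurement):

```python
# Approximate per-stream bandwidth needs, using the figures above (Mb/s)
STREAM_MBPS = {"720p": 6, "1080p": 12}

def max_streams(throughput_mbps, quality):
    """How many simultaneous streams of a given quality fit in the throughput."""
    return throughput_mbps // STREAM_MBPS[quality]

print(max_streams(50, "720p"))   # 8 streams on 50Mb/s of usable bandwidth
print(max_streams(50, "1080p"))  # 4 streams
```

In practice, contention between streams means the real ceiling arrives well before these neat divisions suggest.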

Any kind of virtual desktop or cloud application will be sensitive to variable network service, and this is not just a matter of bandwidth. We are used to essentially immediate response from our desktop applications, so anything less from software perceived to be in this category will feel frustrating. If a VPN or cloud application is regularly sluggish, users will want to switch back to locally-installed software instead. When network computers had a resurgence at the tail end of the 1990s, one of the factors that prevented their widespread adoption was the lack of responsiveness compared to regular office desktops. The advantage of centralised management was far outweighed by the disadvantage to everyday usability.

Although the technology of VPNs, cloud desktops and cloud applications has moved on considerably since the renewed interest in network computing of 15 years ago, the fact that these kinds of services are now being delivered over wireless networks has brought additional constraints. Whilst a wireless network may in theory be rated well beyond the levels required by any of these activities, in practice the reliable bandwidth could well be an order of magnitude lower. Another key factor here is latency. Plentiful bandwidth won't be much use if every exchange is slow to start. Wireless networks can have latencies ten times those of wired networks, and this can be further accentuated when there are lots of devices on the WLAN, sharing the throughput and negotiating their slice of the channel.
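The effect of latency on throughput can be sketched with the classic TCP bound of one receive window per round trip. The 64KB window (no window scaling) and the RTT figures below are illustrative assumptions, but they show why a tenfold latency increase matters even on a fast link:

```python
# TCP can send at most one receive window per round trip, so
# throughput <= window / RTT, regardless of the link's rated bandwidth.
def tcp_throughput_mbps(window_bytes, rtt_ms):
    """Upper bound on TCP throughput (Mb/s) for a window size and round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

print(tcp_throughput_mbps(65535, 2))   # wired-like 2ms RTT: ~262Mb/s ceiling
print(tcp_throughput_mbps(65535, 20))  # ten-times-more-latent WLAN: ~26Mb/s
```

Ten times the latency cuts the achievable throughput of a single connection by the same factor, which is exactly the sluggishness a virtual desktop user perceives.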

You probably won't be getting the specified rate out of your Wi-Fi, either. A WLAN access point specified as 802.11n can theoretically run connections at up to 600Mb/s when all four MIMO streams are utilised. This also requires the use of double-width 40MHz channels, and the congestion of the 2.4GHz spectrum in most urban areas will often make it hard to find a clear 40MHz block. The 2.4GHz Wi-Fi standards specify up to 14 channels in Japan, up to 13 in Europe and most other countries, and up to 11 in the USA. The channel centres are only 5MHz apart, though, so with 20MHz-wide channels only three non-overlapping channels are available, and in 40MHz mode only a single clear block fits, centred around channel 3. So if there are multiple access points in the area, there's a good chance that 40MHz mode won't be available, and even 20MHz slots could be congested.
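The channel arithmetic above is easy to verify. Here is a minimal sketch of the 2.4GHz channel plan, with centre frequencies and a simple overlap test assuming 20MHz-wide channels:

```python
# 2.4GHz Wi-Fi channel centres sit 5MHz apart, starting at 2412MHz (channel 1);
# channel 14 is a special case at 2484MHz, permitted only in Japan.
def centre_mhz(channel):
    """Centre frequency in MHz of a 2.4GHz Wi-Fi channel."""
    return 2484 if channel == 14 else 2407 + 5 * channel

def overlaps(ch_a, ch_b, width_mhz=20):
    """True if two channels' bands overlap at the given channel width."""
    return abs(centre_mhz(ch_a) - centre_mhz(ch_b)) < width_mhz

print([centre_mhz(c) for c in (1, 6, 11)])  # [2412, 2437, 2462]
print(overlaps(1, 6))   # False: 25MHz apart, the classic non-overlapping spacing
print(overlaps(1, 3))   # True: only 10MHz apart, so the bands collide
```

Channels 1, 6 and 11 are 25MHz apart, which is why they form the standard non-overlapping trio; any closer spacing interferes.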

Wireless networking can fall foul of its own backwards compatibility, too. Although 802.11n has been around for a while, there's a very good chance that there are still devices using the older 802.11g standard on your network. These will drop bandwidth down to a theoretical maximum of 54Mb/s, unless you have a dual-band access point or router, or a secondary access point to service legacy devices. But the latter won't help the frequency congestion problem, only add to it.

Another issue with wireless networking is range. A typical access point will have a range of around 50m indoors and less than 100m outdoors. A WLAN operating at 5GHz, such as the older 802.11a standard, is more susceptible to obstructions than one using 2.4GHz, and hence has a range of about a third of this. As devices approach the range limit, performance drops noticeably, particularly if there are obstructions in the line of sight. Although the MIMO (multiple input, multiple output) technology used by 802.11n and above has improved performance considerably in cluttered spaces, there are still plenty of reasons why a Wi-Fi access point will never come close to its theoretical maximum.
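Part of the range difference between the two bands follows directly from free-space path loss, which grows with frequency. A minimal sketch using the standard FSPL formula (the 30m distance is an arbitrary illustrative choice, and real walls and furniture add further loss on top):

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

# Same 30m path at 5GHz versus 2.4GHz: the higher band loses ~6.4dB more
extra_loss = fspl_db(30, 5000) - fspl_db(30, 2400)
print(round(extra_loss, 1))  # 6.4
```

That 6.4dB gap holds at any distance, since only the frequency term changes, and obstructions penalise 5GHz further still.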

The most recent 802.11ac standard looks like it should solve this bandwidth issue. It can provide up to 6.93Gb/s of aggregate bandwidth, more than 11 times the throughput of 802.11n, because it supports twice as many MIMO streams (eight rather than four, although the first generation of 802.11ac only offered three) and 80MHz channels, which can be bonded into 160MHz blocks as well. However, the top performance requires multiple antennas on both the access point and the device, and many devices only support one antenna, giving a theoretical maximum of 433Mb/s on a single 80MHz band.

The 802.11ac standard's ability to bond two 80MHz channels into 160MHz doubles bandwidth again. So even a single-antenna device supporting this feature will have an 867Mb/s theoretical throughput, and quad-antenna devices could provide as much as 3.47Gb/s. However, as with a 40MHz dual channel for 802.11n, the 160MHz option with 802.11ac means only a few access points can coexist. Only two 160MHz bands fit in the waveband without overlapping, and just one in the US, whilst only five 80MHz bands (four in the US) are available. A top-of-the-range access point with eight antennas and 160MHz channel support will also require two Gigabit Ethernet links to the wired network to achieve full performance.
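These headline figures all derive from a per-stream rate of roughly 433Mb/s per 80MHz of channel width (assuming the top VHT modulation with a short guard interval; quoted maxima vary slightly with rounding and guard-interval assumptions). A sketch of the arithmetic:

```python
# Approximate top per-stream 802.11ac PHY rates by channel width (Mb/s),
# assuming the highest VHT modulation and a short guard interval
PER_STREAM_MBPS = {40: 200.0, 80: 433.3, 160: 866.7}

def max_phy_rate_mbps(streams, width_mhz):
    """Theoretical aggregate 802.11ac PHY rate for a stream count and channel width."""
    return streams * PER_STREAM_MBPS[width_mhz]

print(max_phy_rate_mbps(1, 80))    # ~433: single-antenna client on one 80MHz band
print(max_phy_rate_mbps(1, 160))   # ~867: single antenna with 160MHz bonding
print(max_phy_rate_mbps(4, 160))   # ~3467: quad-antenna device
print(max_phy_rate_mbps(8, 160))   # ~6934: eight-stream access point maximum
```

Real-world throughput sits far below these PHY rates once protocol overhead, contention and distance are accounted for.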

The 802.11ac standard also operates at 5GHz, which as we have already discussed has a shorter range than 2.4GHz, although the spectrum is less crowded, with no microwave ovens using that frequency. To mitigate the reduced range, 802.11ac supports beamforming: rather than broadcasting omnidirectionally, the signal is steered towards each device, which should improve range when the access point is not serving too many devices. But, again, 802.11g devices cannot use the 5GHz radio at all, so they will continue to operate at 802.11g speeds and won't see the benefit of the new 802.11ac technology. Although 802.11ac doesn't natively support 2.4GHz, most routers offer secondary radios to cater for 2.4GHz 802.11n, as well as other legacy devices.

The congestion problem is not just about the bandwidth available to each individual Wi-Fi connection, but the number of connections that an access point may be asked to support. In theory, an access point is only limited by the number of IP addresses it can dole out to connected devices, which will be 254 from a typical /24 subnet if it is acting as a DHCP server itself. Some routers limit the Wi-Fi allocation to fewer than 254, whilst a corporate WLAN using 802.1x authentication could support hundreds. But each device will grab a slice of the wireless connection, and the more devices there are, the smaller the slices.
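The 254 figure comes from the arithmetic of a typical /24 private subnet, which Python's standard ipaddress module can confirm (the 192.168.1.0/24 network below is just an illustrative choice):

```python
import ipaddress

# A typical home access point hands out addresses from a /24 subnet
net = ipaddress.ip_network("192.168.1.0/24")

# Two of the 256 addresses are reserved: the network and broadcast addresses
usable_hosts = net.num_addresses - 2
print(usable_hosts)  # 254
```

In practice the access point's own address comes out of that pool too, and each connected device then contends for its share of the airtime.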

One technology that can be used to alleviate some of these stresses is Wireline networking, which uses existing non-Ethernet wiring infrastructure as a conduit for high-speed networking. Unless you are in a remote log cabin in the forest, your location will have power cabling, and any coaxial wiring (for TV aerials, for example) or twisted-pair telephone wiring could also be used. This avoids the cost of laying fresh dedicated Ethernet cabling, and the potential bandwidth and quality-of-service issues associated with wireless networking.

The ITU-T G.hn standard supports Gb/s data rates over telephone wiring, coaxial cables, and power lines. As with Wi-Fi standards since 802.11n, MIMO technology is used to increase data transfer rates and improve signal quality when interference occurs. There are even plans to embed G.hn in the power supplies of devices so that they can be networked via their AC connections. A router with embedded G.hn can use the Wireline network as a trunk between locations, so you can place access points in close proximity to your users. G.hn makes this possible without having to lay Ethernet cabling from wireless access points to the main wired network. A new access point could be installed wherever power-line, coaxial or telephone cabling is available, and act as a gigabit-speed wireless hub to the network or Internet.

The fact that we can carry on working or playing, whether we are wired or not, has been a huge liberation for both business and entertainment. You can access your work documents from the cloud, log into the same desktop via VPN from office, hotel or home, and stream live television in bed. But for these capabilities to deliver on their promise, particularly as more devices become Wi-Fi-enabled, the wireless network needs to be configured wisely. Careful choice of access point and client devices, as well as using a technology like Wireline to bring the access points closer to your wireless users, will improve their experience considerably, and deliver the true promise of high-performance wireless networking.

This post is brought to you by Comcast Business.
