
What will advances in natural language processing mean for smart devices?

(Image credit: Shutterstock/polkadot_photo)

As a branch of artificial intelligence (AI) that deals with the interaction between computers and humans, natural language processing (NLP) is becoming more intelligent and is revolutionising the way we interact with smart devices and chatbots. As developments in this area continue at pace, Gartner estimates that the NLP market will be worth $13.4 billion by 2020, and with machine learning becoming ever more sophisticated, the way people communicate with their smart devices will continue to evolve. So, what changes are we currently seeing in the way people interact with smart devices, and how will this progress in the future?

Determining user intent

One of the biggest advances we’re seeing is that users no longer have to be explicit with the requests and questions they put to their smart device, and this will continue to improve as the AI becomes more powerful. Previously, when asking a smart device to perform an action, the user had to be definitive, using specific words or phrases in a particular order. Now, consumers can be much more natural in the way they interact with devices. For instance, when setting an alarm, the user can say “set the alarm for half 6” rather than the more formal “set the alarm for six-thirty”, as the technology is getting better at understanding colloquial expressions and conversational tones.
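Under the hood, a request like this is typically split into an intent (set an alarm) and a slot (the time), with colloquial phrasing normalised before any action fires. The snippet below is only a minimal, rule-based sketch of that idea; the patterns and function names are illustrative assumptions, and a real assistant would rely on a trained language-understanding model rather than hand-written rules.

```python
import re

# Hypothetical rules mapping colloquial time phrases to (hour, minute).
# In British English, "half 6" conventionally means 6:30.
COLLOQUIAL_TIMES = {
    r"half (\d{1,2})": lambda h: (int(h), 30),
    r"quarter past (\d{1,2})": lambda h: (int(h), 15),
    r"quarter to (\d{1,2})": lambda h: (int(h) - 1, 45),
    r"(\d{1,2})[.:](\d{2})": lambda h, m: (int(h), int(m)),
    r"(\d{1,2}) o'?clock": lambda h: (int(h), 0),
}

def parse_alarm_request(utterance: str):
    """Return (hour, minute) if the utterance looks like an alarm request."""
    text = utterance.lower()
    if "alarm" not in text:
        return None  # not the alarm intent
    for pattern, to_time in COLLOQUIAL_TIMES.items():
        match = re.search(pattern, text)
        if match:
            return to_time(*match.groups())
    return None

print(parse_alarm_request("set the alarm for half 6"))      # (6, 30)
print(parse_alarm_request("set the alarm for 6:30"))        # (6, 30)
# A fuller system would also handle spelled-out numbers such as "six-thirty".
```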

Increasingly, smart devices are also becoming much more intuitive, tailoring their actions or feedback to the first comment the user makes. For example, if the consumer asks, “what time is ‘X’ programme on tonight?”, the AI will process this initial command and intuit what the user might want next, perhaps offering to set a reminder for the programme or to record it. Through the use of AI, the device has an awareness of the user as a person, of what they might want tied to the original command, and of the other devices the user has within their home. In these situations, smart devices aren’t simply working through a fixed lookup; the AI keeps the relevant information up to date and ensures that what flows back to the user follows naturally from the conversation they are having.
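One way to picture this behaviour is as a mapping from the resolved intent to candidate follow-up actions, filtered by what the devices in the home can actually do. The sketch below is purely illustrative; the intent names, device types and capability lists are assumptions rather than any vendor’s real API.

```python
# Illustrative follow-up suggestions keyed by the user's initial intent.
FOLLOW_UPS = {
    "programme_schedule_lookup": [
        ("set_reminder", "Would you like me to set a reminder for it?"),
        ("record_programme", "Shall I record it for you?"),
    ],
}

# Which follow-up actions each (hypothetical) device type can carry out.
DEVICE_CAPABILITIES = {
    "set_top_box": {"record_programme"},
    "smart_speaker": {"set_reminder"},
}

def suggest_follow_ups(intent, household_devices):
    """Offer only the follow-ups that some device in the home supports."""
    available = set()
    for device in household_devices:
        available |= DEVICE_CAPABILITIES.get(device, set())
    return [prompt for action, prompt in FOLLOW_UPS.get(intent, [])
            if action in available]

# After "what time is 'X' programme on tonight?" resolves to a schedule lookup:
print(suggest_follow_ups("programme_schedule_lookup",
                         ["smart_speaker", "set_top_box"]))
```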

Developing user profiles

As NLP improves, smart devices will soon be able to recognise different users by their voice. This will enable them to build up a digital profile for each user within the home and remember information about them, such as their favourite shows or the time they want their alarm set each morning. As a result, when a particular user asks their device to turn on the television, it will not only do so but will also ask whether they would like to watch a specific programme based on their viewing habits. Currently, because smart devices tend to be set up by one person within the home, other users can struggle to be understood by the device. With advances in NLP, however, smart devices will become intelligent enough to recognise different users, so that other members of the household can interact with the device even though they aren’t the main user.
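Conceptually, this works by matching each utterance against enrolled “voiceprints” and then reading preferences from the matching profile. The following is a minimal sketch of that flow, assuming a speaker-recognition model has already turned speech into embedding vectors; the vectors, threshold and profile fields here are invented for illustration.

```python
import math

# Hypothetical enrolled voiceprints (speaker embeddings) and remembered preferences.
PROFILES = {
    "alice": {"voiceprint": [0.9, 0.1, 0.3], "favourite_show": "Bake Off", "alarm": "06:30"},
    "bob":   {"voiceprint": [0.2, 0.8, 0.5], "favourite_show": "Match of the Day", "alarm": "07:15"},
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def identify_speaker(embedding, threshold=0.8):
    """Match an utterance embedding to the closest enrolled profile, if any."""
    best_user, best_score = None, 0.0
    for user, profile in PROFILES.items():
        score = cosine_similarity(embedding, profile["voiceprint"])
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= threshold else None

user = identify_speaker([0.88, 0.12, 0.28])
if user:
    print(f"Turning on the TV. Would you like to watch {PROFILES[user]['favourite_show']}?")
else:
    print("Turning on the TV.")  # unrecognised speaker gets a generic response
```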

In the long term, this could go even further, allowing users to switch off security alarms with their voice on entering their home, for example, with the device recognising that the voice is live and not just a recording. Soon enough, advances in NLP could even see smart devices talking back in a more conversational manner, using voice recognition and these profiles to address users by name.

A better understanding of accents

As well as struggling to switch between users, the majority of smart devices have been poor at understanding different or strong accents. Research finds that nearly four-fifths of speakers with regional accents deliberately adjust the way they speak to ensure voice recognition systems understand them. This ties in with the finding that 30 per cent of Northern Irish and 45 per cent of Welsh people say they struggle to be understood by smart home devices, while 42 per cent of us slow down our speech. Similarly, it has often been noted that smart devices are much better at understanding men’s voices than women’s. Fortunately, advances in AI are helping smart devices overcome these challenges by essentially factoring accents and dialects out of the language processing stage, which helps them understand what the consumer is saying first time round.
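Most of this robustness comes from acoustic models trained on a wider range of voices, but part of the text-side idea can be illustrated simply: variant or dialect words that a recogniser might transcribe are normalised to one canonical form before the request reaches intent matching. The word list below is a made-up example of that step, not a real product’s vocabulary.

```python
# Hypothetical post-transcription normalisation: dialect variants the speech
# recogniser might emit, mapped to a single canonical form before intent matching.
DIALECT_NORMALISATION = {
    "telly": "television",
    "tellybox": "television",
    "doon": "down",
    "oot": "out",
    "tae": "to",
}

def normalise_transcript(transcript: str) -> str:
    """Replace known dialect variants so downstream NLU sees one consistent form."""
    return " ".join(DIALECT_NORMALISATION.get(word, word)
                    for word in transcript.lower().split())

print(normalise_transcript("turn the telly doon"))  # "turn the television down"
```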

Moving beyond voice control

While there have been substantial developments in NLP, there are still some things that aren’t quite suited to voice, such as turning the volume up or scrolling when looking for a film to watch. As a result, it’s likely that in future we’ll see more smart devices featuring sensors that can detect movement, allowing users to create physical commands for these actions. In these scenarios, users would be able to move their arm up or down to indicate which way they want the volume to go, or move their arm across to prompt their television to scroll to the next page. Further to this, we may even see some smart devices introduce facial recognition. However, facial recognition may open up another can of worms in terms of privacy, leaving users concerned about whether they are being watched. As a compromise, we could instead see sensors being used to help devices recognise each user by their build. For example, the sensors would be able to detect how tall each user is and add that to their profile, alongside what their voice sounds like and what their habits are. This means users won’t have to worry that their device can actually see them and record their image.
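The gesture side of this amounts to mapping a sensor’s reading of arm movement onto device commands. As a rough sketch, assuming the sensor reports normalised horizontal and vertical displacements (the thresholds and command names here are invented):

```python
from typing import Optional

def gesture_to_command(dx: float, dy: float, threshold: float = 0.2) -> Optional[str]:
    """Map normalised arm displacement (dx horizontal, dy vertical) to a command."""
    if abs(dy) >= abs(dx) and abs(dy) > threshold:
        return "volume_up" if dy > 0 else "volume_down"
    if abs(dx) > threshold:
        return "scroll_next_page" if dx > 0 else "scroll_previous_page"
    return None  # movement too small to count as a deliberate gesture

print(gesture_to_command(0.05, 0.6))    # volume_up
print(gesture_to_command(0.5, -0.1))    # scroll_next_page
```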

It could be argued that we’re only just scratching the surface of what NLP has to offer smart devices. In just the last few years the functionality and understanding of smart devices have increased substantially, and as NLP and AI continue to advance, this looks set to continue. We may even reach a point where we don’t need to ask our smart devices for anything at all, with them able to predict our wants and needs based on our daily routines. Ultimately, as NLP continues to advance, users will be able to do even more with their smart devices, and much more easily. In years to come, it’s likely we won’t know how we functioned without them.

Kuldip Singh Johal, VP of sales for subscription broadcasting, Universal Electronics