How to design the AI device of the future

In 1990 Tim Berners-Lee began writing the World Wide Web, an innovation that changed the world as we knew it forever. But as a five-year-old, my mind was being blown by a dancing flower. My brothers, friends and dog were all mesmerised by this incredible new toy that came to life and “danced” (jiggled) when music was played – it genuinely felt like sci-fi magic. Almost thirty years later, our dancing flowers have evolved into AI-powered devices like Google Home, Amazon Echo and the new Apple HomePod – able to turn on lights, order pizzas and even tell us jokes. We’ve replaced dancing flowers in pots with talking bots that look like plant pots. Showing off what Google Home can do to friends still elicits the same wide-eyed, sci-fi wonder that we felt as five-year-olds staring at that wobbling flower for the very first time. But can this magic be sustained, or, just like the dancing flower, will we become bored of its simple trick and move on to the next big thing? Where next for AI devices, and what will they look like in the future?

As product technology advances, AI assistants like Alexa will no longer need a physical home such as the Amazon Echo; soon they’ll be everywhere at once, accessible from a range of enabled devices – from our computers to our cars. And just as our phones have become more than devices for making calls, soon the products in our homes will be multifunctional too – we’ll be able to tweet from a microwave or order a pizza through a lightbulb. A product will no longer be constrained by its primary function but will do multiple things – and our virtual assistants will be built in, always on hand when we need them.

Human beings are tribal, and our tribes extend further than our families. Friends and pets can all become part of that tribe – we even form emotional connections to our objects. As AI devices become more integrated into our lives, it’s possible these too will become part of the family. However, if AI is heading toward being omnipresent, we may lose a vital opportunity for it to become part of our physical world. Becoming an anywhere/anytime service is obviously appealing for the tech giants, but as human beings we still desire and attach ourselves to physical objects. A virtual assistant is helpful, but a physical object can become part of the family.

Non-verbal machine communication

The AI devices on the market today are varied – ranging from nerdy geek-tech to unobtrusive objets d’art. It’s an industry still in its infancy, desperately trying to figure out what people want and trying a myriad of things in the process – from the simple to the complex. One of the main problems facing AI objects right now is basic communication, and it’s something no tech company is getting quite right. If 93 per cent of communication is non-verbal, then having an object “talk” to you simply isn’t enough. Objects that communicate with humans need to do so in ways we instinctively understand – which is predominantly at a non-verbal level. Making a device spin when it’s ‘happy’ or flash when it’s ‘thinking’ is cute, but not human – we don’t do this. As humans we have to interpret this new “language” and remember it; the more things a device does, the more we have to learn and the more confusing it becomes. The problem could be in the design process itself: currently we have humans designing devices to communicate with humans… in non-human ways – we’re making the challenge far more complex than it needs to be. The solution doesn’t have to be complicated; we just have to look outside the usual design process for guidance.

Product designers and tech developers are going to have to invite other disciplines to the table if they really want to create something special. Involving behavioural psychologists, linguistics experts and even puppeteers could teach companies how to bring an inanimate object to life in a way that we instinctively understand. Imagine an object that reacts with a range of complex non-verbal communication that even a child could understand – not only able to hear and speak but to convey curiosity, shyness, fear, boredom, playfulness or confusion. The slightest tilt toward us would let us know it’s listening; a ‘look’ up would let us know it’s thinking. A beautiful piece of design, powered by AI but combined with human characteristics, would not only be helpful in the home – it could become one of the most human pieces of technology we’ve ever created.

A lot of companies and start-ups are also making the mistake of trying to do all the heavy lifting: designing a beautiful object as well as building all the behind-the-scenes artificial intelligence. Google are trying to make this easier by releasing the Google Assistant software development kit, which allows developers to build hardware with Google Assistant baked in. This is a win-win for both parties: product designers can focus their time on the device, and the tech giants can piggyback their assistive agents into more homes and ultimately more lives. The more the tech giants can offer AI as a service that can be plugged into, the more time it frees up for product designers to create something wonderful.

In time, the battle for the one assistive intelligence to rule them all may be won, but until then we shouldn’t have to settle for whatever object it comes packaged in. Product designers should be able to create a range of AI objects to suit the individual, the home or the family. AI objects should be as diverse as chairs are today – designed around user needs and appealing to the individual, not one solution for all. Virtual assistants are undoubtedly incredible and will go on to automate our lives in unexpected and wonderful ways. But when it comes to creating objects for homes and families, let’s think a little less plant pot and a little more flower pot. It’s worth remembering the importance of the object, and that feeling we all had when we first saw that little plastic flower jiggling to “Never Gonna Give You Up”.

Tom Moran, Senior UX Designer, TH_NK
Image Credit: TH_NK