If you’re building products or offering services, this concerns you, because trust will be the most important word of the coming decade.
Another important word will be attention.
In an era defined by an overabundance of information, our attention is becoming an increasingly valuable commodity, and we’re beginning to think more critically about how and where that attention is best spent.
We live busy lives, and as the capabilities of everyday technology grow, we can now outsource our attention to products and services (Google Maps, for instance, has taken over our need to navigate), ideally to products and services that we trust.
Mark Zuckerberg famously wears a grey t-shirt every day.
He doesn’t devote his valuable attention to deciding on an outfit in the morning - he spends it elsewhere.
The rest of us may not dress in a Silicon-Valley-style uniform, but in some areas of our lives, we’re all wearing the grey t-shirt.
Have you ever asked Google Maps for the best route on your commute home and passively turned onto an unfamiliar road at the app’s suggestion?
If so, you were wearing the grey t-shirt in that moment, because you chose to devote your attention to the countless other decisions in your day instead.
There was a time when we didn’t simply accept Google Maps’ directions, though.
Maybe you marvelled the first time you encountered its technology; maybe you were sceptical.
But when you first used a service like this, it was pure magic: ‘How the hell did this thing know the best route?’
Then you started to take it for granted, became less and less impressed, and perhaps began to question it.
A commute taking a few minutes longer is no big deal, but what if a new AI machine built to diagnose serious illnesses recommends that your right arm should be removed? You’re going to want a second opinion on that, and questioning begins at the extremes.
But how does it work?
Imagine a new search engine for flights is launched, believed to be the best one yet.
You have a conference to get to in Boston, so you submit your dates and destination and it confidently announces it’s found you the best possible flight. ‘You can’t beat this deal!’ it tells you.
As you’re parting with a lot of money, you decide to take the challenge and scour competing search engines and online forums.
As it turns out, the new platform was right - you can’t find a better deal. You can’t find a better deal the second or the third time you use the same platform, either. The fourth time you book, you don’t even bother hunting for a better deal. You’re now wearing the grey t-shirt. Your time can be put to better use elsewhere. The platform has won your trust.
Then, the platform takes it to the next level. It starts asking questions: do you have a family? What’s your job? What’s your income? Where did you go on vacation last summer? And the summer before that? Which one did you enjoy more?
With a little bit of clustering and classification, the platform suggests a vacation spot perfectly tailored to your needs, so you go ahead and book a flight.
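To make the “clustering and classification” step concrete, here is a minimal sketch of how such a platform might work under the hood. Everything in it is hypothetical: the user profiles, the feature choices (income and family size), and the tiny k-means routine are illustrative stand-ins, not the platform’s actual method.

```python
import math
import random
from collections import Counter

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means: returns each point's cluster label."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [min(range(k), key=lambda c: dist(p, centroids[c])) for p in points]
        # recompute each centroid as the mean of its assigned points
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = tuple(sum(xs) / len(members) for xs in zip(*members))
    return labels

# Hypothetical user profiles: (income in $k, family size) and last vacation booked.
profiles = [(40, 1), (45, 1), (42, 2), (120, 4), (115, 3), (130, 4)]
vacations = ["Berlin", "Lisbon", "Berlin", "Maui", "Maui", "Bora Bora"]

labels = kmeans(profiles, k=2)

def recommend(new_profile):
    """Place the new user in the cluster of their most similar existing user,
    then suggest that cluster's most popular destination."""
    nearest = min(range(len(profiles)), key=lambda i: dist(new_profile, profiles[i]))
    cluster = labels[nearest]
    picks = [v for v, lab in zip(vacations, labels) if lab == cluster]
    return Counter(picks).most_common(1)[0][0]

print(recommend((118, 4)))  # a high-income family lands in the "Maui" cluster
```

The point of the sketch is how little data is needed: a handful of answers about income, family, and past trips is enough to group you with similar users and predict where you’ll want to go next.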
Now the platform is in an interesting position: it knows something new about your future. And it can make money from that information, getting paid to suggest hotels where you can stay, restaurants you might like, and activities you’d probably enjoy.
Perhaps you take up those suggestions and have a wonderful holiday all the same. But would you have taken the platform’s advice if you’d known that it wasn’t exactly neutral?
Now, more than ever, consumers are beginning to understand and question how products and services like the flight search engine work. We know that these platforms ostensibly help us, but we also know they’re helping the profit margins of their owners.
If it's free...
When we trust a product or service, we let it do our thinking for us. But if our trust is breached, if our data is sold to others or used against us, we become cautious.
Maybe we still wear the grey t-shirt though. After all, it’s easier to book through the flight search engine, but we know that wearing it comes at a cost. From this knowledge, we start to change our behaviour and question things.
This questioning starts at the extremes, but once we understand the myriad of ways our trust has been breached, our scepticism will extend to day-to-day AI. The dirty secrets behind some of the technology that makes our lives so much easier are already starting to come to light, such as how Amazon Alexa will only ever recommend Amazon Basics batteries, rather than cheaper or higher-quality alternatives.
Short-term, it might benefit your bottom line to breach your users’ trust in exchange for a greater profit. But the tide of public opinion is fast turning and, in the long term, if your company can’t demonstrate trustworthiness, it’s likely to be swept away.
If you’re collecting and analysing consumer data, it’s possible that data has already been impacted by trust issues. We’re willing to give our data to products and services, provided they give us something in return. But when we understand that the advice and suggestions we are offered are somehow compromised, we become a lot stingier with our personal information.
To collect high-quality, trustworthy data, you need to offer a high-quality, trustworthy service.
Think of a therapist: her clients unburden their deepest, darkest secrets to her. Sometimes they tell her things they’ve never told their closest friends or family.
They trust her.
They trust she will use the information she receives only for its intended purpose. They understand she needs this information to personally tailor her advice to them, and they know that the more information they give, the better the advice will be that they receive in return.
So which firms are asking for your secrets? Amazon? Tinder? Facebook? Can you trust them like you would your therapist? Are they keeping your secrets private? Are they offering you the best possible service in return?
What about your smart calendar? Is it suggesting the best venue for your annual general meeting? Or merely the venue that it’s being paid to push on you? And what is it doing with all the information it has on your habits, routines, and schedule?
There’s a saying in tech: When a service is free, you’re the product.
Unlike a lot of other smart calendars and scheduling services, Doodle is completely independent. We work to be as trustworthy and transparent as possible, and Doodle users pay to access our premium product: their data doesn’t pay for them, so they can safely turn their attention elsewhere.
When something as valuable as human attention is on the line, we strongly believe that tech companies have an obligation to ensure their users can place it in their hands in good faith.
Matty Mariansky, co-founder of Doodle’s AI scheduling tool Meekan
Image Credit: Shutterstock/xtock