According to a recent report by Blueshift, over three quarters (81 per cent) of marketers using AI and machine learning for personalisation are exceeding revenue goals by at least 30 per cent. However, it is well documented that gender biases are ingrained in AI tools – as demonstrated by researchers at the University of Bath – so businesses need the right practices in place to use the technology successfully.
AI plays an important role in personalisation, but it can produce biased outcomes. Getting ahead of AI bias – by considering potential outcomes and designing an approach that favours inclusive scenarios – can counteract and even eliminate detrimental bias in personalisation. From the teams businesses hire to the technology they create, companies in 2020 must strive to ensure these advancements meet the needs of all people, regardless of gender, race, wealth, education or other differences that could result in bias.
From inaccurately suggesting a clothing size a consumer might like based on previous purchases, to personalising content only with models of a specific background, businesses are defined by these subliminal messages when engaging with their audiences. With the ongoing controversy surrounding AI, it is critical that businesses course-correct on equal opportunity and unintentionally biased technologies, to ensure they provide the best possible customer experience and do not fail on inclusion.
The importance of blending data with context
Adding AI to personalisation is like asking a famous chef to cook a special meal for someone. The chef needs more than good ingredients: they need to know whether the diner has any dietary requirements, particular preferences or food allergies, so the meal can be enjoyed rather than cause a negative outcome. The same is true of personalisation – it works best with a blend of good data and an understanding of who is on the receiving end, so the right experience is delivered.
A McKinsey report notes that personalisation is a “huge boon” for companies and consumers alike. It adds that relevant and useful communications can create lasting customer loyalty and drive revenue growth of 10 to 30 per cent. Personalisation also has the power to increase engagement, improve relevance and drive higher ROI and lifetime customer value. When done well, customers appreciate the “tailored experience” – as long as it doesn’t cross lines, and delivers genuine value and relevance. Despite Gartner’s recent report claiming that 80 per cent of brands will abandon personalisation within five years, it is very much still front of mind for businesses and part of their 2020 strategy, as they recognise the importance of providing contextualised communications to their audience.
With AI, companies are finding that they can scale their personalisation tremendously and deliver more accurate, more effective consumer engagements. It is up to the teams managing this software to plan ahead to make the most of these powerful new tools, layering data with real examples of user interaction to build a richer experience.
Not all AI systems are equal
We often see stories in the news referring to ‘AI gone bad’. From a sexist chatbot to a racist Netflix algorithm, companies don’t always plan ahead to ensure that AI and other automated tools push for progress. The changes need to start inside the organisation, where senior AI roles are more likely to be held by men. This has a knock-on effect: these machines have a male-leaning design and can fall short of delivering a gender-equal experience. Women must be given a voice and a role so that, alongside men, they can shape a bias-free future.
An insightful paper by MIT researchers urges companies to get ahead of AI bias by thinking about potential outcomes. Brands have to consider the effects of technology on their approach to diversity and inclusion if they are to be truly sensitive to their consumers across all channels of communication.
Algorithms might pick up on the size of clothing people shop for, the skin colour of the models in the ads they linger over, or the price of the goods they often buy, and all of this can be reflected in content and advertising. In an untrained system, these insights might end up presenting a customer with an exaggerated set of assumptions – for example, that all women have the same body shape, or that certain groups of people prefer one type of product – which can trigger a very negative reaction. Consumers want their data to be interpreted in a way that is meaningful to them, so businesses have a responsibility to process customer data and uncover insights that allow them to communicate with authenticity.
Consider the effects of technology on diversity and inclusion
Companies like IBM are paving the way in improving how gender and race are represented in technology and in fighting bias. Last year, IBM released a new dataset that tagged features from one million photos taken from Flickr – mapping facial measurements, symmetry, age and gender – to help tackle bias in facial recognition technology. Initiatives like this help combat the problem and benefit the accuracy of AI used for personalisation.
Expert partners can help brands create a strategy that makes the most of powerful AI to drive better personalisation, building on their expertise in what works best in different scenarios. It is far better to plan ahead and build positive outcomes into the system before consumers come into the mix, creating the ideal platform from the outset. To avoid these biases, it is vital that businesses work with AI specialists whose technology is sophisticated enough to overcome potential diversity issues.
Bias may be an unavoidable fact of life, but it should be avoidable in powerful new technologies. The industry has far to go before it can deploy an AI system entirely free of bias, but the key is to create a strategy that makes AI sensitive to consumers’ backgrounds and needs across all communication channels.
Lisa Kalscheur, founder and co-chair, AppNexus Women's Network