The scene in the picture above is from the 2014 sci-fi movie RoboCop. In it, our hero, RoboCop, analyses his enemy's emotional state and concludes that violence is imminent.
You could say he successfully detected emotions. For a living person, that isn't a big deal, but for a computer, that's science fiction. Or, to be precise, that *was* science fiction. Not any more.
Microsoft has announced plans to release public beta versions of new tools in its Project Oxford suite, which can help developers create smarter apps, including apps that can recognise emotion.
Project Oxford includes four main components: face recognition, speech processing, visual tools, and Language Understanding Intelligent Service (LUIS).
The new tools are aimed at developers who are not experts in artificial intelligence but would still like to use AI capabilities in their apps.
The emotion tool can be used to create systems that recognise eight core emotional states – anger, contempt, disgust, fear, happiness, neutral, sadness and surprise – based on universal facial expressions that reflect those feelings. The tool is available to developers as a public beta.
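To give a feel for how a developer might use the beta, here is a minimal Python sketch that composes a request to the emotion recognition endpoint. The endpoint URL, the `Ocp-Apim-Subscription-Key` header name, and the response shape are assumptions based on the public beta as announced; check Microsoft's current documentation before relying on them.

```python
# Sketch of calling the Project Oxford emotion recognition service.
# ENDPOINT and the subscription-key header are assumptions about the
# public beta, not guaranteed to match the live service.

import json
import urllib.request

ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize"  # assumed beta URL


def build_emotion_request(image_url: str, api_key: str) -> urllib.request.Request:
    """Compose the HTTP request asking the service to score an image.

    The service is expected to reply with one entry per detected face,
    each carrying scores for the eight emotional states listed above.
    """
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": api_key,  # placeholder; supply your own key
        },
        method="POST",
    )


# To actually send the request (requires a valid subscription key):
# with urllib.request.urlopen(
#     build_emotion_request("https://example.com/face.jpg", "YOUR_KEY")
# ) as resp:
#     for face in json.load(resp):
#         print(max(face["scores"], key=face["scores"].get))
```

Separating request construction from sending it keeps the sketch easy to inspect without a network connection or a real API key.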
"Developers might want to use these tools to create systems that marketers can use to gauge people’s reaction to a store display, movie or food. Or, they might find them valuable for creating a consumer tool, such as a messaging app, that offers up different options based on what emotion it recognises in a photo," Ryan Galgon, a senior program manager within Microsoft’s Technology and Research group said.