iPhone 4S users have grown used to the convenience of a know-it-all assistant that can spice up a conversation with an unexpected snappy answer; now hackers have taken Siri a step further.
Josh Evans and Ollie Hayward explained on their blog how they managed to convince Siri to read their minds; apparently, Siri is able to interpret 25 brain patterns and translate them into commands, the Sydney Morning Herald reports.
The experiment, called "Project Black Mirror", is revealed in a demonstration video which shows Josh Evans with electroencephalography pads on his forehead and a look of concentration. Seconds later, the connected iPhone 4S announces in Siri's familiar voice "calling Graham" and places the call.
The process purportedly relies on recording the signature brain patterns produced when thinking of each voice command, then pairing those patterns with the commands to build a 25-phrase dictionary. Strictly speaking, what Siri does in the demonstration is not mind reading: it is "recognising" stored brain patterns and translating them into commands.
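Evans and Hayward have not published their code, so the details here are guesswork, but the pairing they describe amounts to a small lookup table: store one signature per phrase, then match a live reading against the closest stored one. A minimal Python sketch, with made-up feature vectors and a hypothetical `match_command` helper standing in for whatever their rig actually does:

```python
import math

# Hypothetical sketch: the project's real signal processing is unpublished.
# Each "signature" stands in for a recorded EEG feature vector, paired with
# a Siri voice command to form a small phrase dictionary.
PHRASE_DICTIONARY = {
    "call Graham":     [0.9, 0.1, 0.4],
    "what time is it": [0.2, 0.8, 0.5],
    "set an alarm":    [0.6, 0.6, 0.1],
}

def euclidean(a, b):
    """Distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_command(live_pattern, threshold=0.5):
    """Return the phrase whose stored signature is nearest to the live
    pattern, or None if no signature is close enough to trust."""
    phrase, signature = min(
        PHRASE_DICTIONARY.items(),
        key=lambda item: euclidean(live_pattern, item[1]),
    )
    return phrase if euclidean(live_pattern, signature) <= threshold else None

# A live reading close to the stored "call Graham" signature
print(match_command([0.85, 0.15, 0.38]))  # call Graham
```

The threshold is the important design choice: without it, any stray reading would trigger the nearest phrase, which is exactly why repeatability of brain patterns matters so much for a scheme like this.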
The technology is not entirely new, as proved by Emotiv and its creation, an EEG-based videogame controller. However, the news should be taken with a pinch of salt: the human mind is hard to reduce to repeatable brain patterns, and the technology is still far from delivering significant and reliable results in this area.