AI: Stephen Hawking warns that robots will spell doom for humans

World-renowned physicist Stephen Hawking has spoken out about the dangers of artificial intelligence.

In an article contributed to the Independent, Hawking, along with fellow scientists Frank Wilczek, Max Tegmark and Stuart Russell, warned of the risks posed by the continued development of advanced artificial intelligence.

Read more: AISight, the slightly scary CCTV network completely run by AI

The group identified self-driving cars, IBM's supercomputer Watson, Apple's Siri, Microsoft's Cortana and even Google Now as examples of technological advances whose implications may not have been properly thought through.

In 2011, Watson beat former champions Ken Jennings and Brad Rutter on the gameshow Jeopardy!, before making headlines last year when it learnt how to swear. At Mobile World Congress in February, IBM chief Virginia Rometty revealed that the supercomputer was being used in the education and medical sectors, and announced her intention to make cognitive computing a big part of the future.

While the scientists concede that the creation of AI could represent one of the most impressive achievements in the history of mankind, they believe it could also lead down a very dangerous path.

"The potential benefits are huge," they write. "Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last."

We're not yet able to build computers more powerful than the human brain, but that looks likely to change in the coming years. What's more, Hawking and his co-authors reckon it will only be a matter of time before AI starts improving itself, and that is when humans could lose all control of the situation.

Read more: Supercomputer takes 40 mins to calculate a second of human brain activity

"One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders and developing weapons we cannot even understand," they continue.

"Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all."

Image credit: Flickr (Péter Kelemen)