IBM pushes its cognitive computing vision

IBM is looking to the world of nature to create next-generation computing systems, building processors with digital 'neurons' and 'synapses' that process data in much the same way as the human brain.

While much of the company's research into next-generation processing platforms has concentrated on evolutionary improvements to existing systems - such as graphene-based processors, photonic computing systems, or memristor-based circuits - its latest project looks to nature for inspiration.

Dubbed 'Systems of Neuromorphic Adaptive Plastic Scalable Electronics' - or SyNAPSE - the project has already produced two prototype chips which recreate the behaviour of spiking neurons and synapses in the human brain through novel algorithms and advanced circuitry.
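The spiking behaviour being mimicked can be sketched with a toy leaky integrate-and-fire neuron - a standard textbook model, offered here purely as an illustration and not a description of IBM's actual circuitry. A 'membrane potential' accumulates weighted input spikes, leaks away over time, and fires (then resets) when it crosses a threshold:

```python
# Toy leaky integrate-and-fire neuron (illustrative only; all parameter
# values below are arbitrary, not taken from IBM's chips).

def simulate_lif(input_spikes, weight=0.5, leak=0.9, threshold=1.0):
    """Return the output spike train for a stream of binary input spikes."""
    potential = 0.0
    output = []
    for spike in input_spikes:
        potential = potential * leak + weight * spike  # integrate and leak
        if potential >= threshold:
            output.append(1)   # fire...
            potential = 0.0    # ...and reset
        else:
            output.append(0)
    return output

print(simulate_lif([1, 1, 1, 0, 0, 1, 1, 1]))  # → [0, 0, 1, 0, 0, 0, 0, 1]
```

The neuron only fires after enough closely-spaced inputs arrive - the timing of spikes, not a stored numeric value, carries the information, which is the essential departure from conventional logic.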

It's a move which not only promises major improvements in performance and power consumption, but which looks to change how computing works at a fundamental level. Rather than simply calculating the results of problems posed to it, a cognitive computer powered by IBM's neurosynaptic processors would learn from experience, find correlations, create hypotheses, and remember previous outcomes - vastly expanding the kinds of data that can be processed by machine.

If that all sounds a bit sci-fi and far-fetched, it's worth mentioning at this point that the Defense Advanced Research Projects Agency has ponied up an impressive $21 million in funding to take the project to its next phase. That's the same DARPA, by the way, which created ARPANET, the network that would later grow into the Internet we know and love.

The project has been running at IBM since 2009, and its goal is nothing less than a computing system which takes in information from multiple sensory modalities simultaneously, rewiring itself as it interacts with its environment while rivalling a brain for compact size and low power draw.

"This is a major initiative to move beyond the von Neumann paradigm that has been ruling computer architecture for more than half a century," explained IBM Research project leader Dharmendra Modha. "Future applications of computing will increasingly demand functionality that is not efficiently delivered by the traditional architecture. These chips are another significant step in the evolution of computers from calculators to learning systems, signaling the beginning of a new generation of computers and their applications in business, science and government."

IBM is hoping to head off any suggestion that it's creating a Robocop-esque fusion of man and machine: while the project borrows biological terms like 'synapse,' 'neuron,' and 'axon,' the neurosynaptic processors themselves contain no biological material and are entirely electronic.

The company's two existing prototypes are built on a 45nm process and contain 256 neurons each; one core holds 262,144 programmable synapses, while the other holds 65,536 synapses which actually learn from experience. Simple tasks such as navigation, machine vision, pattern recognition, associative memory, and classification have all been demonstrated on the chips in small-scale trials.
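The difference between the two cores - fixed, programmable synapses versus synapses that learn - can be illustrated with a toy spike-timing-dependent plasticity (STDP) rule, a common model of synaptic learning in the neuroscience literature. This is purely a sketch; the article doesn't detail the chip's actual learning rule, and the parameters below are invented for illustration:

```python
# Toy spike-timing-dependent plasticity (STDP) rule, illustrative only.
# If the input (pre-synaptic) neuron fires shortly before the output
# (post-synaptic) neuron, the synapse strengthens; if it fires shortly
# after, the synapse weakens.
import math

def stdp_update(weight, dt, rate=0.1, tau=20.0, w_max=1.0):
    """Adjust a synaptic weight given dt = t_post - t_pre in milliseconds."""
    if dt > 0:                              # pre before post: potentiate
        weight += rate * math.exp(-dt / tau)
    elif dt < 0:                            # post before pre: depress
        weight -= rate * math.exp(dt / tau)
    return min(max(weight, 0.0), w_max)     # clamp to [0, w_max]

print(stdp_update(0.5, 10.0))   # pre led post: weight rises above 0.5
print(stdp_update(0.5, -10.0))  # post led pre: weight falls below 0.5
```

A synapse updated this way gradually encodes which inputs reliably predict the neuron's output - a crude analogue of the 'learning from experience' the second prototype core demonstrates in hardware.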

Those prototypes are small potatoes compared to the company's overall goal, however: a network of neurosynaptic processors featuring ten billion neurons and a hundred billion synapses, capable of massive feats of parallel processing in a volume of just two litres and consuming a mere kilowatt of power.

IBM isn't the only company looking to the natural world for the next big leap in computing efficiency: microcomputing veteran Steve Furber is building a system which combines standard ARM processing cores in a way that mimics the connections of a biological brain, in an effort to simulate spiking neurons in biological real time.

While Furber's project promises massive performance benefits for the supercomputing industry, it's possible that - if the technology scales in the way that IBM believes it will - SyNAPSE will recreate the very foundation of how computation happens.