IBM builds software to program computer chips that 'act like the brain'

Scientists from IBM have unveiled a software system to support the programming of computer chips that can act like the brain.

The chips in question would have an architecture "inspired by the function, low power and compact volume of the brain," said Big Blue. The technology could "enable a new generation of intelligent sensor networks that mimic the brain’s abilities for perception, action, and cognition," it added.

Such technology could provide sensory-based data input and on-board analytics for automobiles, medical imagers, healthcare devices, smartphones, cameras and robots, said IBM.

IBM’s new programming model differs from the sequential operation underlying today's von Neumann architectures and computers. It is instead "tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures," IBM said.

Dr Dharmendra Modha, principal investigator and senior manager at IBM Research, said, “Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm. We are working to create a FORTRAN for synaptic computing chips.

"While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.”

To enable the new programming ecosystem for these chips, IBM researchers have developed technology supporting all aspects of the programming cycle, from design through to development, debugging and deployment.

IBM has developed a multi-threaded, massively parallel and highly scalable functional software simulator of a cognitive computing architecture, comprising a network of neurosynaptic cores.
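To make the "network of neurosynaptic cores" idea concrete, the sketch below shows a toy discrete-time simulation loop in which cores exchange spike events. This is an illustration only, not IBM's simulator; the names Core, simulate, wiring and external_input are assumptions made for this example.

```python
# Illustrative sketch only: a tiny discrete-time simulation of a network of
# "cores" exchanging spike events. Not IBM's simulator; all names are assumed.
from collections import defaultdict


class Core:
    """A toy core that fires when it has received enough spikes in one tick."""

    def __init__(self, core_id, fire_threshold=1):
        self.core_id = core_id
        self.fire_threshold = fire_threshold
        self.pending = 0              # spikes received but not yet processed

    def tick(self):
        """Process buffered spikes for one time step; return True if firing."""
        fired = self.pending >= self.fire_threshold
        self.pending = 0
        return fired


def simulate(cores, wiring, external_input, steps):
    """Run the network for `steps` ticks.

    `wiring` maps a core id to the ids of cores its spikes are routed to;
    `external_input` maps a time step to the core ids receiving outside spikes.
    """
    history = []
    inbox = defaultdict(int)                       # spikes queued for next tick
    for t in range(steps):
        for core_id in external_input.get(t, []):  # inject external stimuli
            inbox[core_id] += 1
        for core in cores.values():                # deliver queued spikes
            core.pending += inbox[core.core_id]
        inbox.clear()
        fired = [c.core_id for c in cores.values() if c.tick()]
        for core_id in fired:                      # route spikes for next tick
            for target in wiring.get(core_id, []):
                inbox[target] += 1
        history.append(fired)
    return history


if __name__ == "__main__":
    cores = {i: Core(i) for i in (1, 2)}
    print(simulate(cores, wiring={1: [2]}, external_input={0: [1]}, steps=3))
    # [[1], [2], []]
```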

There is also a simple, digital, highly parameterised spiking neuron model that "forms a fundamental information processing unit of brain-like computation" and supports a wide range of deterministic and stochastic neural computations, codes and behaviours. A network of such neurons "can sense, remember and act upon a variety of spatio-temporal, multi-modal environmental stimuli."
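As a rough illustration of what a simple, parameterised spiking neuron can mean, here is a minimal leaky integrate-and-fire unit with an optional noisy threshold. It is not IBM's neuron model; the parameters (threshold, leak, noise) are assumptions chosen for the sketch.

```python
# Illustrative sketch only: a minimal leaky integrate-and-fire spiking neuron
# with an optional stochastic threshold. Not IBM's neuron model.
import random


class SpikingNeuron:
    def __init__(self, threshold=10.0, leak=1.0, stochastic=False, noise=2.0):
        self.threshold = threshold    # potential at which the neuron fires
        self.leak = leak              # amount drained from the potential per tick
        self.stochastic = stochastic  # if True, jitter the effective threshold
        self.noise = noise
        self.potential = 0.0

    def step(self, weighted_input):
        """Advance one discrete time step; return True if the neuron spikes."""
        self.potential += weighted_input
        self.potential = max(0.0, self.potential - self.leak)
        threshold = self.threshold
        if self.stochastic:
            threshold += random.uniform(-self.noise, self.noise)
        if self.potential >= threshold:
            self.potential = 0.0      # reset after a spike
            return True
        return False


if __name__ == "__main__":
    neuron = SpikingNeuron(threshold=10.0, leak=1.0, stochastic=True)
    inputs = [3.0, 4.0, 5.0, 0.0, 6.0, 6.0]
    print([neuron.step(x) for x in inputs])  # e.g. [False, False, True, ...]
```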

In addition, there is a programming model based on composable, reusable building blocks called “corelets.” Each corelet represents a complete blueprint of a network of neurosynaptic cores that specifies a base-level function.
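The composability idea can be sketched as follows: each block exposes only named input and output ports, and larger blocks are built by wiring one block's outputs to another's inputs. The sketch is written in Python rather than IBM's corelet language, and the Corelet class and compose function are hypothetical names.

```python
# Illustrative sketch only: the composability idea behind "corelets",
# expressed in Python. Class and function names are hypothetical.


class Corelet:
    """A building block that hides its internal network and exposes only
    named input and output ports, so blocks can be wired together."""

    def __init__(self, name, inputs, outputs):
        self.name = name
        self.inputs = list(inputs)     # externally visible input ports
        self.outputs = list(outputs)   # externally visible output ports
        self.connections = []          # (producer_port, consumer_port) pairs


def compose(name, first, second, wiring):
    """Build a larger block by wiring outputs of `first` to inputs of `second`.

    `wiring` maps output-port names of `first` to input-port names of `second`.
    The composite exposes the remaining unconnected ports as its own interface.
    """
    composite = Corelet(
        name,
        inputs=first.inputs
        + [p for p in second.inputs if p not in wiring.values()],
        outputs=second.outputs + [p for p in first.outputs if p not in wiring],
    )
    composite.connections = [(src, dst) for src, dst in wiring.items()]
    return composite


if __name__ == "__main__":
    edge_detector = Corelet("edges", inputs=["pixels"], outputs=["edge_map"])
    classifier = Corelet("classify", inputs=["edge_map"], outputs=["label"])
    vision = compose("vision", edge_detector, classifier,
                     wiring={"edge_map": "edge_map"})
    print(vision.inputs, vision.outputs)   # ['pixels'] ['label']
```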

A cognitive system store, or library, contains designs and implementations of consistent, parameterised, large-scale algorithms and applications that link massively parallel, multi-modal, spatio-temporal sensors and actuators together in real time. In a year, IBM researchers have designed and stored more than 150 corelets in this program library.

A teaching curriculum, or laboratory, spans the architecture, neuron specification, chip simulator, programming language, application library and prototype design models. It includes an end-to-end software environment that can be used to create corelets, access the library, experiment with a variety of programs on the simulator, connect the simulator inputs/outputs to sensors/actuators, build systems and visualise the results.

IBM's efforts on "cognitive computing" are being presented at the International Joint Conference on Neural Networks in Texas this week.