Sunday, August 21, 2011

IBM's New Chips Compute More Like We Do

Via MIT Technology Review -

A microchip with about as much brain power as a garden worm might not seem very impressive, compared with the blindingly fast chips in modern personal computers. But a new microchip made by researchers at IBM represents a landmark. Unlike an ordinary chip, it mimics the functioning of a biological brain—a feat that could open new possibilities in computation.


The IBM researchers have built and tested two demonstration chips that store and process information in a way that mimics a natural nervous system. The company says these early chips could be the building blocks for something much more ambitious: a computer the size of a shoebox that has about half the complexity of a human brain and consumes just one kilowatt of power. This is being developed with $21 million in funding from the Defense Advanced Research Projects Agency, in collaboration with several universities.
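The article does not disclose IBM's circuit design, but the kind of neuron these chips emulate is often illustrated with a leaky integrate-and-fire model: a membrane potential accumulates input, leaks over time, and fires a spike when it crosses a threshold. A minimal sketch, purely illustrative and not IBM's implementation:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# Illustrative only -- parameter values and names are assumptions,
# not IBM's actual chip design.

def lif_step(v, input_current, v_rest=0.0, v_thresh=1.0, leak=0.9):
    """Advance the membrane potential one time step.
    Returns (new_potential, spiked)."""
    v = v_rest + leak * (v - v_rest) + input_current
    if v >= v_thresh:
        return v_rest, True   # spike, then reset to resting potential
    return v, False

# Drive the neuron with a constant input and record spike times.
v, spikes = 0.0, []
for t in range(20):
    v, spiked = lif_step(v, input_current=0.2)
    if spiked:
        spikes.append(t)
# With these parameters the neuron spikes periodically (here at t = 6 and 13).
```

The key property, which the new hardware exploits, is that state (the membrane potential) and computation (integration and thresholding) live in the same unit, rather than in a separate memory and processor.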

The company's researchers and their academic collaborators will present two papers next month at the Custom Integrated Circuits Conference in San Jose, California, showing that the chip designs have very low power requirements and work with neural-circuit-mimicking software. In one experiment, a "neural core," as the new chips are called, learns to play Pong; in another, it learns to steer a car around a simple race track; and in a third, it learns to recognize images.


So far the IBM team has demonstrated only very basic software on these chips, but they have laid the foundation for running more complex software on simpler computers than has been possible in the past. In 2009, the group led by IBM's Dharmendra Modha ran simulations of a neural network as complex as a cat's brain on a supercomputer. "They cut their teeth on massive simulations," says Michael Arbib, director of the USC Brain Project. "Now they've come up with chips that may make it easier to [run cognitive computing software]—but they haven't proven this yet," he says.

Modha's group started by modeling a system of mouse-like complexity, then worked up to a rat, a cat, and finally a monkey. Each time they had to switch to a more powerful supercomputer. And they were unable to run the simulations in real time, because of the separation between memory and processor that the new chip designs are intended to overcome. The new hardware should run this software faster, using less energy, and in a smaller space. "Our eventual goal is a human-scale cognitive-computing system," Modha says.


Step 1 = Run complex simulations with best-of-breed hardware
Step 2 = Learn lessons from completed simulations > knowledge
Step 3 = Identify hardware limitations & methods to improve process > knowledge
Step 4 = Design & implement knowledge / identified improvements into new hardware
Step 5 = Run [more] complex simulations with [newly designed] best-of-breed hardware
Step 6 = Goto Step 2
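The steps above describe a feedback loop: each simulation run informs the next hardware generation, which in turn enables a larger simulation. A toy sketch of that loop, where every function and field name is a hypothetical placeholder rather than IBM's actual tooling:

```python
# Hedged sketch of the simulate-learn-redesign loop in the steps above.
# All names (run_simulation, design_hardware, "capacity") are illustrative
# placeholders, not real IBM tools or metrics.

def run_simulation(hardware, complexity):
    # Stand-in for Steps 1/5: a run "succeeds" only if hardware
    # capacity covers the model's complexity.
    return {"succeeded": hardware["capacity"] >= complexity,
            "complexity": complexity}

def design_hardware(hardware, lessons):
    # Stand-in for Steps 3-4: each redesign roughly doubles capacity
    # based on what the last run taught us.
    return {"generation": hardware["generation"] + 1,
            "capacity": hardware["capacity"] * 2}

hardware = {"generation": 1, "capacity": 1}
complexity = 1
for _ in range(4):
    results = run_simulation(hardware, complexity)   # Steps 1/5: run the simulation
    lessons = results                                # Step 2: learn from the completed run
    hardware = design_hardware(hardware, lessons)    # Steps 3-4: fold lessons into new hardware
    complexity *= 2                                  # Step 6: scale the next run and repeat
```

After four passes the sketch ends at hardware generation 5 with capacity 16, mirroring the mouse-to-rat-to-cat-to-monkey progression described above.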
