The New York Times | Technology
[Image: A schematic showing the layout of the new processor, named TrueNorth. Credit: IBM]
By JOHN MARKOFF
Inspired by the architecture of the brain, scientists have developed a new kind of computer chip that uses no more power than a hearing aid and may eventually excel at calculations that stump today’s supercomputers.
The chip, or processor, is named TrueNorth and was developed by researchers at IBM and detailed in an article published on Thursday in the journal Science. It tries to mimic the way brains recognize patterns, relying on densely interconnected webs of transistors similar to the brain’s neural networks.
The chip’s electronic “neurons” are able to signal others when a type of data — light, for example — passes a certain threshold. Working in parallel, the neurons begin to organize the data into patterns suggesting the light is growing brighter, or changing color or shape.
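In software terms, each such electronic “neuron” behaves roughly like an accumulator that fires only when its input crosses a threshold. The sketch below illustrates that idea in Python; the class name, threshold value and weights are illustrative assumptions, not details of IBM’s design.

```python
# A minimal sketch of the kind of "threshold neuron" the article describes:
# it accumulates weighted input and signals other neurons (fires) only when
# the running total crosses a threshold. Names and constants here are
# illustrative assumptions, not details of IBM's TrueNorth design.

class ThresholdNeuron:
    def __init__(self, threshold: float = 1.0):
        self.threshold = threshold
        self.potential = 0.0  # accumulated input, analogous to membrane potential

    def receive(self, signal: float, weight: float = 1.0) -> bool:
        """Accumulate a weighted input; return True (a 'spike') at threshold."""
        self.potential += signal * weight
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after firing
            return True
        return False

# Example: a steadily brightening light delivers input until the neuron fires.
neuron = ThresholdNeuron(threshold=1.0)
for brightness in (0.2, 0.3, 0.6):
    if neuron.receive(brightness):
        print(f"spike at brightness {brightness}")  # fires at 0.6 (total 1.1)
```

Many such units firing in parallel, each reacting only to its own inputs, is what lets the network as a whole organize raw data into patterns.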
The processor may thus be able to recognize that a woman in a video is picking up a purse, or control a robot that is reaching into a pocket and pulling out a quarter. Humans are able to recognize these acts without conscious thought, yet today’s computers and robots struggle to interpret them.
[Image: A silicon chip relies on webs of transistors similar to the brain’s neural networks. Credit: IBM]
The chip contains 5.4 billion transistors, yet draws just 70 milliwatts of power. By contrast, modern Intel processors in today’s personal computers and data centers may have 1.4 billion transistors and consume far more power — 35 to 140 watts.
Today’s conventional microprocessors and graphics processors can perform billions of mathematical operations a second, while the new chip’s system clock ticks barely a thousand times a second. But because its vast number of circuits work in parallel, it is still capable of performing 46 billion operations a second per watt of energy consumed, according to IBM researchers.
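A rough back-of-envelope calculation, using only the figures quoted above, shows where that efficiency comes from; the arithmetic below is an illustrative sketch, not IBM’s published analysis.

```python
# Back-of-envelope arithmetic using only the figures quoted in the article.
# The derivation is illustrative, not IBM's published methodology.

ops_per_second_per_watt = 46e9  # 46 billion operations a second per watt
power_watts = 0.070             # 70 milliwatts of power draw
neurons = 1_000_000             # one million "neurons"
ticks_per_second = 1_000        # system clock of roughly a thousand ticks/s

total_ops_per_second = ops_per_second_per_watt * power_watts
ops_per_neuron_per_tick = total_ops_per_second / (neurons * ticks_per_second)

print(f"total throughput: {total_ops_per_second:.2e} ops/s")   # ~3.2e9
print(f"per neuron, per tick: {ops_per_neuron_per_tick:.1f}")  # ~3.2
```

Each neuron handles only a few operations per clock tick; the aggregate throughput comes from a million slow circuits working simultaneously, which is why the power draw stays so low.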
With one million “neurons,” the TrueNorth is about as complex as the brain of a bee.
“It is a remarkable achievement in terms of scalability and low power consumption,” said Horst Simon, deputy director of the Lawrence Berkeley National Laboratory.
He compared the new design to the advent of parallel supercomputers in the 1980s, which he recalled was like moving from a two-lane road to a superhighway.
The new approach to design, referred to variously as neuromorphic or cognitive computing, is still in its infancy, and the IBM chips are not yet commercially available. Yet the design has touched off a vigorous debate over the best approach to speeding up the neural networks increasingly used in computing.
The idea that neural networks might be useful in processing information occurred to engineers in the 1940s, before the invention of modern computers. Only recently, as computing has grown enormously in memory capacity and processing speed, have they proved to be powerful computing tools.
In recent years, companies including Google, Microsoft and Apple have turned to pattern recognition driven by neural networks to vastly improve the quality of services like speech recognition and photo classification.
But Yann LeCun, director of artificial intelligence research at Facebook and a pioneering expert in neural networks, said he was skeptical that IBM’s approach would ever outpace today’s fastest commercial processors.
“The chip appears to be very limited in many ways, and the performance is not what it seems,” Mr. LeCun wrote in an email sent to journalists. In particular, he criticized as inadequate the testing of the chip’s ability to detect moving pedestrians and cars.
“This particular task,” he wrote, “won’t impress anyone in computer vision or machine learning.” Mr. LeCun said that while special-purpose chips running neural networks might be useful for a range of applications, he remained skeptical about the design IBM has chosen.
Several neuroscience researchers and computer scientists disputed his critique.
“The TrueNorth chip is like the first transistor,” said Terrence J. Sejnowski, director of the Salk Institute’s Computational Neurobiology Laboratory. “It will take many generations before it can compete, but when it does, it will be a scalable architecture that can be delivered to cellphones, something that Yann’s G.P.U.s will never be able to do.”
G.P.U. refers to graphics processing unit, the type of chip used today to render graphics and video on computer screens and to handle specialized processing tasks in supercomputers.
IBM’s research was funded by the Defense Advanced Research Projects Agency, a research arm of the Pentagon, under a program called Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNapse. According to Gill Pratt, the program manager, the agency is pursuing twin goals in its effort to design ultralow-power, biologically inspired processors.
The first, Dr. Pratt said, is to automate some of the surveillance done by military drones. “We have lots of data and not enough people to look at them,” he said.
The second is to create a new kind of laboratory instrument to allow neuroscientists to quickly test new theories about how brains function.
Correction: August 7, 2014
Because of an editing error, an earlier version of this article misstated the day on which the report of a new computer chip was published. It was Thursday, not Wednesday.
Correction: August 8, 2014
An earlier version of this article omitted the last word in the name of a program known by the acronym SyNapse, which funded IBM’s research. It is Systems of Neuromorphic Adaptive Plastic Scalable Electronics.