The IBM chip, called TrueNorth, is built of 4,096 tiny computing cores that form about a million digital brain cells and 256 million connections. Together they act like the brain's neurons, sending short messages to one another to process data.
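To make that idea concrete, here is a minimal, purely illustrative Python sketch of the spike-message principle: simplified digital neurons that accumulate input from their connections and emit a short message only when a threshold is crossed. The class, thresholds and weights below are invented for illustration and are not IBM's actual core design.

```python
class Neuron:
    """Leaky integrate-and-fire style digital neuron (illustrative only)."""

    def __init__(self, threshold=1.0, leak=0.1):
        self.potential = 0.0          # accumulated input "charge"
        self.threshold = threshold
        self.leak = leak              # decay applied each quiet time step
        self.targets = []             # outgoing (neuron, weight) connections

    def connect(self, other, weight):
        self.targets.append((other, weight))

    def receive(self, weight):
        self.potential += weight

    def step(self):
        """One time step: fire a spike to neighbors if the threshold is crossed."""
        if self.potential >= self.threshold:
            self.potential = 0.0      # reset after firing
            for target, weight in self.targets:
                target.receive(weight)   # the "short message" to a neighbor
            return True
        self.potential = max(0.0, self.potential - self.leak)
        return False


# Tiny two-neuron chain: external input drives `a`, and a's spike drives `b`.
a, b = Neuron(), Neuron()
a.connect(b, weight=1.5)
a.receive(1.2)                        # input pushes `a` over its threshold
for t in range(3):
    fired = [name for name, n in (("a", a), ("b", b)) if n.step()]
    print(f"step {t}: fired {fired or 'nothing'}")
```

A real neuromorphic core wires up thousands of such neurons in hardware and routes the spikes over an on-chip network, but the event-driven, fire-only-when-needed behavior is the same basic pattern.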
The design, known as neuromorphic computing, marks a dramatic departure from traditional chips that run software packaged into strict sequences of instructions. Neuromorphic chips are also optimized to get large amounts of processing done without consuming as much power as traditional chips.
Samsung has adapted TrueNorth into its Dynamic Vision Sensor, which processes video imagery very differently than traditional digital cameras do.
"Each pixel operates independently" and pipes up only if it needs to report a change in what it's seeing, said Eric Ryu, a vice president of research at the Samsung Advanced Institute of Technology. He spoke Thursday at an IBM Research event celebrating the 30th anniversary of its Almaden lab near Silicon Valley.
The result is a camera that can keep track of what's going on at a remarkable 2,000 frames of video per second. Ordinary digital cameras typically max out at 120fps. That speed is useful for creating 3D maps, for safety features on self-driving cars and for new forms of remote controls that recognize gestures.
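The pixel-reports-only-changes idea is the core of any event-based sensor. Below is a toy Python sketch of that principle, not Samsung's actual sensor design; the function name, threshold and frame sizes are made up for illustration.

```python
import numpy as np

def frame_to_events(prev, curr, threshold=15):
    """Toy event-camera model: each pixel reports only if its brightness
    changed by more than `threshold` since the last reading. Returns a
    list of (row, col, polarity) events instead of a full frame."""
    diff = curr.astype(int) - prev.astype(int)
    rows, cols = np.where(np.abs(diff) > threshold)
    return [(int(r), int(c), 1 if diff[r, c] > 0 else -1)
            for r, c in zip(rows, cols)]

# A static scene produces no events; one brightening pixel produces one event.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200                      # a single pixel brightens
print(frame_to_events(prev, curr))    # [(1, 2, 1)] -- only the changed pixel reports
```

Because static parts of the scene generate no data at all, the sensor can report changes thousands of times a second without drowning in redundant pixels, which is what makes the 2,000-events-per-second-style tracking feasible at low power.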
Samsung demonstrates how a brain-like chip can be used to sense hand motions to control a TV.
Samsung's TrueNorth-based system is very efficient, consuming about 300 milliwatts of power, Ryu said. That's roughly a hundredth of the power a laptop processor draws and a tenth of what a phone processor uses. The brain, though, can handle some tasks with 100 million times less power than a computer, he said.
"There is a huge gap between biology and modern silicon technology," Ryu said.
Samsung demonstrated the chip recognizing hand gestures so a person could control a TV. It recognized hand waves, finger waves, closed fists and finger pinches from about 10 feet away.
Because the chips run cool, Samsung expects to be able to stack them together into bigger groups. IBM already gangs them together into 16-chip packages that come closer to matching the scale of the roughly 86 billion neurons in a human brain.
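Some rough arithmetic, based only on the figures in this article, shows how far even a 16-chip package remains from brain scale; the package count below is a back-of-the-envelope calculation, not an IBM claim:

```python
neurons_per_chip = 1_000_000           # "about a million digital brain cells" per TrueNorth chip
chips_per_package = 16                 # IBM's 16-chip packages
brain_neurons = 86_000_000_000         # roughly 86 billion neurons in a human brain

package_neurons = neurons_per_chip * chips_per_package
print(package_neurons)                 # 16 million neurons in one package
print(brain_neurons / package_neurons) # ~5,375 such packages to match the brain's neuron count
```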
IBM is trying to get partners to explore what's possible with TrueNorth. Other researchers noodling with TrueNorth spoke at the IBM event.
One is the Air Force Research Laboratory, which is investigating TrueNorth for use in identifying unusual events in video, detecting computer attacks, turning printed text or audio into searchable data and summarizing it, and giving drones autonomy when in flight but not connected to a human controller at a distant military base.
A drone "has to be able to know where it is, what to do next, where to fly next," said Qing Wu, a principal electronics engineer at the Air Force lab. "We need very power-efficient processing on board. That's where we believe IBM's TrueNorth chip can help dramatically."
The Air Force is developing a control system that marries a traditional computer with a TrueNorth system, Wu said, a combination that would be analogous to the hemispheres in the brain. "The left brain is good at sequential processing and math, and the right brain is good at inference and understanding of the situation," Wu said.
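As a purely hypothetical illustration of that split, the sketch below pairs a conventional sequential planner with a mocked-up neuromorphic perception step; none of the function names, data or logic come from the Air Force's actual system.

```python
def neuromorphic_inference(sensor_frame):
    """Stand-in for pattern inference offloaded to a chip like TrueNorth."""
    # Pretend classification: report an "obstacle" if anything bright appears.
    return "obstacle" if max(sensor_frame) > 200 else "clear"

def sequential_planner(state, observation):
    """Stand-in for the conventional ('left brain') sequential control logic."""
    if observation == "obstacle":
        return {**state, "heading": state["heading"] + 90}   # turn away
    return {**state, "waypoint": state["waypoint"] + 1}      # continue along route

state = {"heading": 0, "waypoint": 0}
for frame in ([10, 12, 8], [30, 250, 40], [9, 11, 7]):       # fake sensor frames
    observation = neuromorphic_inference(frame)              # "right brain": perception
    state = sequential_planner(state, observation)           # "left brain": next decision
    print(observation, state)
```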
Lawrence Livermore National Laboratory, too, is using TrueNorth. The research facility has tested it successfully to detect cars in cluttered scenes in video shot from above, sort of like video from a satellite or surveillance aircraft. It's also working on integrating it with traditional supercomputers.