
Image credit: ktsimage via Getty Images

MIT has a new chip to make AI faster and more efficient on smartphones

The chip processes data up to seven times faster while using 95 percent less power than earlier designs.

Just one day after MIT revealed that some of its researchers had created a super low-power chip to handle encryption, the institute is back with a neural network chip that reduces power consumption by 95 percent. That efficiency makes it ideal for battery-powered gadgets like mobile phones and tablets, allowing them to take advantage of more complex neural networks.

Neural networks are made up of lots of simple, interconnected information processors. Typically, these networks learn to perform tasks by analyzing huge data sets and applying what they've learned to new inputs. They're used for now-familiar jobs like speech recognition and photo manipulation, as well as more novel tasks, like reconstructing what your brain actually sees, generating quirky pickup lines and naming craft beers.

The problem is that neural nets are big, and the computations they run are power-intensive. The ones in your phone tend to be tiny for that reason, which limits their practicality. In addition to cutting power consumption, the new MIT chip speeds up neural network computation by three to seven times over earlier iterations. The researchers simplified the core machine-learning computation in neural networks down to a single operation, called a dot product. This represents the combined traffic between various nodes in the network and removes the need to shuttle that data back and forth to memory, as earlier designs did. The new chip can calculate dot products for multiple nodes (16 nodes in the prototype) in one step instead of moving the raw results of every computation between the processor and memory.
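The idea can be sketched in software. The snippet below (an illustrative NumPy sketch, not the chip's actual implementation; all values are made up) shows how a single node's work reduces to one dot product, and how 16 nodes' dot products collapse into a single matrix-vector product rather than 16 separate round trips:

```python
import numpy as np

# One neuron's core computation is a dot product of its inputs and weights.
inputs = np.array([0.5, -1.2, 3.0, 0.7])
weights = np.array([0.8, 0.1, -0.4, 1.5])
single_node = np.dot(inputs, weights)  # one node's pre-activation value

# The prototype computes dot products for 16 nodes in one step --
# equivalent to one matrix-vector product instead of moving 16
# intermediate results between the processor and memory.
rng = np.random.default_rng(0)
layer_weights = rng.standard_normal((16, 4))  # 16 nodes, 4 inputs each
layer_output = layer_weights @ inputs          # 16 dot products at once
```

On the chip this happens in analog circuitry inside the memory array itself, which is where the power savings come from.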

IBM's vice president of AI, Dario Gil, thinks this is a huge step forward. "The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays," he said in a statement. "It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT in the future."

