MIT unveils low-power chip for AI

MIT researchers have developed a special-purpose chip that performs neural-network computations three to seven times faster than its predecessors, while cutting power consumption by 94-95%.

This could make it practical to run neural networks locally on smartphones or to embed them in household appliances.

Neural networks are densely interconnected meshes of simple information processors that learn to perform tasks by analysing huge sets of training data.

Neural networks are large, however, and their computations are energy intensive – rendering them impractical for handheld devices.
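To give a rough sense of why that is, the sketch below runs a single forward pass through a small fully connected network in Python with NumPy. The layer sizes, weights, and input are made up for illustration and have nothing to do with MIT's chip; the point is simply that each layer boils down to a large batch of multiply-accumulate operations, and those operations are what dominate a neural network's time and energy budget on a general-purpose processor.

```python
import numpy as np

# Hypothetical layer sizes for illustration only; real networks are far larger.
layer_sizes = [784, 256, 128, 10]

# Random weights stand in for a trained network.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((m, n)) for m, n in zip(layer_sizes, layer_sizes[1:])]

def forward(x, weights):
    """One inference pass: each layer is a matrix-vector product
    (many multiply-accumulates), followed by a simple non-linearity."""
    for w in weights[:-1]:
        x = np.maximum(x @ w, 0.0)  # ReLU activation on hidden layers
    return x @ weights[-1]          # raw scores from the final layer

x = rng.standard_normal(layer_sizes[0])
scores = forward(x, weights)

# Count the multiply-accumulate operations this single pass performs.
macs = sum(m * n for m, n in zip(layer_sizes, layer_sizes[1:]))
print(f"Output scores: {scores.round(2)}")
print(f"Multiply-accumulates for one input: {macs:,}")
```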

Most smartphone apps that currently rely on neural nets upload data to Internet servers, which process it and send the results back to the phone.
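That offload pattern looks roughly like the sketch below. The endpoint URL and JSON response format are hypothetical, but the structure is typical: the phone ships the raw data over the network, a server runs the neural network, and only the result comes back, which costs latency and bandwidth and requires connectivity.

```python
import requests

# Hypothetical inference endpoint; the URL and response schema are illustrative only.
INFERENCE_URL = "https://example.com/api/classify"

def classify_remotely(image_bytes: bytes) -> str:
    """Upload raw data to a server-side neural network and return its answer.
    Every request pays a network round trip and needs a working connection."""
    response = requests.post(
        INFERENCE_URL,
        files={"image": ("photo.jpg", image_bytes, "image/jpeg")},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["label"]

# Usage (assumes a local photo and a reachable server):
# with open("photo.jpg", "rb") as f:
#     print(classify_remotely(f.read()))
```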

MIT’s special-purpose chip may solve this problem.

Now read: MIT unveils energy-efficient encryption for Internet of Things
