Eyeriss – MIT Researchers Unveil Low Power Chip That Can Power Mobile AI
A team of researchers at MIT, including one of Indian origin, has announced a new energy-efficient chip with 168 cores, called Eyeriss, that can perform powerful tasks related to artificial intelligence (AI).
The researchers, from the Massachusetts Institute of Technology (MIT), designed the chip specifically to implement neural networks, and recently presented their findings at the International Solid-State Circuits Conference in San Francisco, CA.
According to MIT, neural networks are typically implemented on the GPU (graphics processing unit) within mobile devices. This is because a mobile GPU may have as many as 200 processing units, or cores, making it well suited to running a network.
Eyeriss, the new chip, is 10 times as efficient as a mobile GPU. It performs complex AI algorithms locally, without communicating with a server over an internet connection, thereby eliminating a major battery-draining operation.
The GPU is a specialised circuit designed to accelerate the creation of images in a frame buffer intended for output to a display.
Modern smartphones are equipped with advanced embedded chipsets that can perform many different tasks depending on their programming.
GPUs are an essential part of those chipsets and as mobile games are pushing the boundaries of their capabilities, the GPU performance is becoming increasingly important.
“Deep learning is useful for many applications such as object recognition, speech and face detection,” said Vivienne Sze, assistant professor in MIT’s department of electrical engineering and computer science, in a statement.
“Right now, the networks are pretty complex and are mostly run on high-power GPUs. You can imagine that if you can bring that functionality to your cell phone or embedded devices, you could still operate even if you don’t have a Wi-Fi connection,” Sze said. “You might also want to process locally for privacy reasons. Processing it on your phone also avoids any transmission latency, so that you can react much faster for certain applications.”
For the IoT (Internet of Things), Eyeriss could enable smart devices and embedded implants to make decisions locally, without having to share all the data they collect with servers on the internet.
At the conference, the MIT researchers used Eyeriss to implement a neural network that performs an image recognition task. It was the first time a state-of-the-art neural network had been demonstrated on a custom chip. Eyeriss was partially funded by DARPA.
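The on-device inference described above can be sketched, in greatly simplified form, as a single forward pass through a small neural network that runs entirely locally, with no server round-trip. This is an illustrative sketch only; the layer sizes and random placeholder weights below are assumptions for demonstration and have nothing to do with Eyeriss's actual architecture or any trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def classify_locally(image, W1, b1, W2, b2):
    """Run a tiny one-hidden-layer classifier on-device, with no network I/O."""
    hidden = np.maximum(0, image @ W1 + b1)   # ReLU hidden layer
    logits = hidden @ W2 + b2
    exp = np.exp(logits - logits.max())       # numerically stable softmax
    return exp / exp.sum()                    # class probabilities

# Hypothetical sizes: a flattened 28x28 grayscale image, 10 output classes.
image = rng.random(28 * 28)
W1, b1 = rng.standard_normal((784, 64)) * 0.01, np.zeros(64)
W2, b2 = rng.standard_normal((64, 10)) * 0.01, np.zeros(10)

probs = classify_locally(image, W1, b1, W2, b2)
prediction = int(probs.argmax())  # decided on the device itself
```

Because every step above is plain local arithmetic, the prediction is available without any transmission latency and without sending the image off the device, which is the practical point the researchers make about running such networks on a chip like Eyeriss.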