
IBM's AI Chip Revolution: Energy-Efficient Genius


IBM's latest AI chip promises a major leap in energy efficiency for speech recognition, delivering more than a dozen times the efficiency of conventional microchips. The innovation could matter for prominent AI systems, including large language models such as ChatGPT and the generative AI used to create visual content.

While AI has steadily improved transcription accuracy over the years, the cost and energy consumption of the hardware behind it are a growing concern. Training OpenAI's GPT-3, for example, is estimated to have cost $4.6 million, running 9,200 GPUs for two weeks. Much of that energy is not spent on computation at all but on moving data between processors and memory, which can cost thousands of times more energy than the arithmetic itself.
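
The data-movement point lends itself to a back-of-envelope check. The sketch below uses illustrative per-operation energy figures (the constants FP32_MAC_PJ and DRAM_READ_32B_PJ and the energy_ratio helper are assumptions for the example, not numbers from the IBM work); actual values vary widely with process node, memory technology, and access patterns.

```python
# Back-of-envelope sketch: why data movement dominates energy use.
# Per-operation energies are illustrative assumptions, in picojoules.

FP32_MAC_PJ = 4.0         # one 32-bit multiply-accumulate (assumed)
DRAM_READ_32B_PJ = 640.0  # fetching a 32-bit word from off-chip DRAM (assumed)

def energy_ratio(weights_fetched_per_mac: float = 1.0) -> float:
    """Ratio of data-movement energy to compute energy when every
    multiply-accumulate has to pull its weight from off-chip memory."""
    movement = weights_fetched_per_mac * DRAM_READ_32B_PJ
    compute = FP32_MAC_PJ
    return movement / compute

if __name__ == "__main__":
    # With no on-chip reuse, moving the operand already costs ~160x the
    # arithmetic; poor caching and wider transfers push this far higher.
    print(f"data movement / compute energy: ~{energy_ratio():.0f}x")
```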

To combat this, neuromorphic hardware that mimics aspects of the brain is gaining traction. IBM's research centers on phase-change memory, which stores a deep neural network's weights as analog conductance values so that computations can be carried out directly in memory instead of shuttling data back and forth to a processor. The result is a microchip the team estimates to be hundreds of times more energy efficient than the best CPUs and GPUs.
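
In-memory analog computing of this kind stores a layer's weights as conductances, so a matrix-vector product emerges from Ohm's law (current = conductance x voltage) and Kirchhoff's summation of column currents rather than from digital arithmetic. The NumPy sketch below is a minimal illustration of that principle under assumed parameters; the g_max conductance range, the 2% programming noise, and the analog_matvec helper are hypothetical, not details of IBM's chip.

```python
import numpy as np

def analog_matvec(weights, x, g_max=1e-6, noise_std=0.02, rng=None):
    """Simulate a matrix-vector product on a resistive crossbar.

    Weights are mapped to device conductances (a differential pair of
    devices handles sign), inputs become read voltages, and the summed
    column currents are the output. Programming error is modeled as
    Gaussian noise on the conductances.
    """
    rng = rng or np.random.default_rng()
    scale = g_max / np.abs(weights).max()
    g_pos = np.clip(weights, 0, None) * scale   # conductances for positive weights
    g_neg = np.clip(-weights, 0, None) * scale  # conductances for negative weights
    # Device-to-device programming error (assumed 2% relative noise).
    g_pos = g_pos * (1 + noise_std * rng.standard_normal(g_pos.shape))
    g_neg = g_neg * (1 + noise_std * rng.standard_normal(g_neg.shape))
    currents = g_pos @ x - g_neg @ x            # Ohm's law + current summation
    return currents / scale                     # map back to the weight domain

# Example: the analog result tracks the exact product within a few percent.
W = np.random.default_rng(0).standard_normal((4, 8))
x = np.random.default_rng(1).standard_normal(8)
print(analog_matvec(W, x, rng=np.random.default_rng(2)))
print(W @ x)
```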

The IBM team tested the chip on two neural networks for speech recognition. Compared with standard hardware, it processed the Google Speech Commands network seven times faster and ran the Librispeech network with 14 times better energy efficiency.

The microchip is also built to support transformers, the neural-network architecture behind chatbots such as ChatGPT. Beyond speech, transformers underpin the generative AI systems known for producing images. Wang, a researcher at Intel Labs, believes the chip could significantly cut the power and expense of running large language models and generative AI.
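
Transformers matter here because their attention and feed-forward layers consist almost entirely of large matrix multiplications, exactly the operation an in-memory analog array performs in place. The sketch below is ordinary NumPy scaled dot-product attention, included only to point out where those matrix products sit; it is not code from, or a model of, the IBM chip.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention; each heavy step is a matrix product that
    analog in-memory hardware could in principle perform on a crossbar."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                   # matmul 1: query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax (done digitally)
    return weights @ v                                # matmul 2: weighted sum of values

# Toy example: 5 tokens, 16-dimensional embeddings.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((5, 16)) for _ in range(3))
print(scaled_dot_product_attention(q, k, v).shape)    # (5, 16)
```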

Nevertheless, these AI models are not without controversy, from factual errors in ChatGPT's output to intellectual-property disputes over generative AI. Wang also notes that IBM's chip is not fully self-sufficient and still depends on external components.

Wang lays out a roadmap of five advances needed before analog AI becomes commercially viable: specialized circuits, a hybrid analog-digital architecture, compilers tailored to the hardware, algorithms that tolerate the errors inherent in analog computing, and applications suited to analog chips. In his view, analog AI is only taking its first steps, with a long journey ahead.

These findings were published in the journal Nature on 23 August.

About this post

Posted: 2023-08-28
By: dwirch

Categories

News

AI


