[USA] Researchers develop a state-of-the-art device to make AI more energy-efficient

On July 26, 2024, researchers at the University of Minnesota Twin Cities demonstrated a device that could reduce energy consumption in AI applications by a factor of at least 1,000. [1] As demand for AI grows, making the technology more energy efficient has become imperative. Conventional AI computation shuttles data back and forth between logic and memory, which consumes a large share of the power; the team demonstrated an architecture in which the data never leaves the memory, known as computational random-access memory (CRAM). In March 2024, the International Energy Agency (IEA) forecast that AI energy consumption would likely double from 460 TWh in 2022 to 1,000 TWh in 2026, roughly equivalent to the electricity consumption of the entire country of Japan.

CRAM performs computations directly within the memory cells, exploiting the structure of the array and eliminating the slow, energy-intensive data transfers of conventional designs. The team plans to work with leaders in the semiconductor industry on large-scale demonstrations and on producing the hardware.
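
To make the contrast concrete, here is a minimal back-of-the-envelope sketch in Python of why keeping computation inside the memory array saves energy. All names and constants in it (E_TRANSFER, E_LOGIC, E_IN_MEMORY, and the two model functions) are illustrative assumptions for this sketch, not figures or code from the researchers' work; real per-operation energies depend on the technology and workload.

```python
# Toy energy accounting: conventional load-compute-store versus an
# in-memory (CRAM-style) operation. All constants are hypothetical
# placeholders for illustration, not measured values.

E_TRANSFER = 100.0   # assumed energy units to move one word over the memory bus
E_LOGIC = 1.0        # assumed energy units for one logic (ALU) operation
E_IN_MEMORY = 1.2    # assumed energy units for one operation done inside the array


def conventional_energy(n_ops: int) -> float:
    """Each operation fetches two operands, computes, and writes back the result."""
    return n_ops * (3 * E_TRANSFER + E_LOGIC)


def in_memory_energy(n_ops: int) -> float:
    """Operands and results stay in the memory cells; no bus transfers occur."""
    return n_ops * E_IN_MEMORY


if __name__ == "__main__":
    n = 1_000_000
    conv = conventional_energy(n)
    cram = in_memory_energy(n)
    print(f"conventional: {conv:.3e} units")
    print(f"in-memory:    {cram:.3e} units")
    print(f"ratio:        {conv / cram:.0f}x")
```

Under these assumed constants the ratio works out to roughly 250x; the point of the sketch is only that the savings come from eliminating data movement, not from faster logic.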

[1] https://cse.umn.edu/ece/news/new-hardware-device-make-artificial-intelligence-applications-more-energy-efficient