Thermodynamic Computing Offers Promising Path to Energy-Efficient Image Generation

By Tina Reynolds

Stephen Whitelam, an experimentalist formerly on staff at the Lawrence Berkeley National Laboratory in California, has performed trailblazing work. Together with a colleague, he showed how thermodynamic computing can reduce the energy used by artificial-intelligence-based image generation by an order of magnitude. The results, which show how to implement a thermodynamic version of a neural network, were published in Nature Communications on January 10.

Whitelam’s novel approach is to feed a thermodynamic computer a library of images, teaching it how to create new ones. Trained this way, the computer can generate convincing images of handwritten digits. The researchers were able to run the required simulations of the device on conventional computers, and reported their findings in Physical Review Letters on January 20.
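The broad picture described here, letting thermal noise destroy an image and then learning to run that process in reverse, is the diffusion-model recipe. The sketch below is illustrative only and is not the authors' implementation: it uses overdamped Langevin dynamics on toy 4-pixel "images", and an exact score function stands in for the neural network a real system would train.

```python
import numpy as np

rng = np.random.default_rng(0)

def thermal_noising(x0, steps=200, dt=0.05, temperature=1.0):
    """Overdamped Langevin relaxation in a quadratic potential U(x) = x**2 / 2.

    Thermal noise gradually erases the starting images, driving the batch
    toward a featureless Gaussian equilibrium -- the 'forward' half of a
    diffusion model, played out as physical dynamics."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += -x * dt + np.sqrt(2 * temperature * dt) * rng.standard_normal(x.shape)
    return x

def langevin_generate(x, score_fn, steps=500, eps=0.01):
    """Unadjusted Langevin sampling: drifting along the score (grad log p)
    while injecting noise draws fresh samples from the data distribution.

    In a trained model score_fn would be a neural network; here it is a
    hypothetical stand-in supplied by the caller."""
    for _ in range(steps):
        x = x + eps * score_fn(x) + np.sqrt(2 * eps) * rng.standard_normal(x.shape)
    return x

# Toy 'library of images': 4-pixel patterns scattered around a template.
template = np.array([1.0, -1.0, 1.0, -1.0])
batch = template + rng.standard_normal((256, 4))

noised = thermal_noising(batch)  # structure destroyed by thermal noise

# The exact score of the unit-variance Gaussian toy data, standing in for
# what a trained network would estimate:
generated = langevin_generate(rng.standard_normal((256, 4)),
                              score_fn=lambda x: -(x - template))
```

On this toy data, averaging the generated batch recovers the template pattern even though every sample was initialized from pure noise; a real image model replaces the analytic score with a learned one.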

The implications of this research are significant. Whitelam found that thermodynamic computing is well suited to producing images with remarkable efficiency, requiring just one ten-billionth of the energy used by today’s most advanced digital neural networks. Thermodynamic computers are still in their early stages compared with their digital equivalents, but they offer an exciting path toward energy-efficient computing.

Normal Computing, a New York City-based startup, has developed a prototype chip featuring eight resonators interconnected through specialized couplers. The chip is designed to generate high-fidelity images quickly, without deploying energy-hungry digital neural networks or noisy pseudorandom number generators.

Though optimistic about these advancements, Whitelam recognized that hurdles remain. Still, he found the results promising:

“This research suggests that it’s possible to make hardware to do certain types of machine learning — here, image generation — with considerably lower energy cost than we do at present.”

To that end, Whitelam and his colleagues have been training neural networks to reverse the process, so that diffusion models can generate brand-new images from given prompts. This is their most ambitious step yet toward applying thermodynamic computing to real-world use cases for machine learning.

“We don’t yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E.”
