Breakthrough in Thermodynamic Computing Promises Energy Savings for AI Image Generation

By Tina Reynolds

A new study led by Stephen Whitelam, a staff scientist at Lawrence Berkeley National Laboratory, describes promising advances in thermodynamic computing that could change the game on energy use in AI-based image generation. Whitelam and a colleague, Craig Cowan, devised a five-part training regimen and demonstrated the power of their technology in an eye-catching way: a thermodynamic computer that produces images of handwritten digits.

On January 10, Whitelam and his collaborators published their findings in an open-access article in Nature Communications. The work marks the public debut of a prototype chip created by Normal Computing, a New York City startup. The chip comprises eight identical resonators linked by custom couplers, an essential step toward making thermodynamic computing a practically useful technology.
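The study itself does not ship code, but the flavor of a small, noisy, coupled analog system can be sketched in a few lines. The snippet below (Python; the overdamped Langevin dynamics, the coupling matrix J, and all parameter values are illustrative assumptions, not a model of Normal Computing's actual resonator chip) shows the basic idea: eight intrinsically noisy modes whose couplings shape the statistics of the samples they produce.

```python
import numpy as np

def simulate_coupled_noisy_modes(J, steps=20_000, dt=1e-3, temperature=1.0, seed=0):
    """Overdamped Langevin dynamics for N coupled noisy modes:

        x <- x - J @ x * dt + sqrt(2 * T * dt) * xi,   xi ~ N(0, I)

    For symmetric positive-definite J, the long-time samples follow a
    Gaussian with covariance T * inv(J), so programming the couplings
    programs the statistics of the output."""
    rng = np.random.default_rng(seed)
    x = np.zeros(J.shape[0])
    samples = []
    for _ in range(steps):
        noise = rng.normal(size=x.shape)  # stand-in for physical thermal noise
        x = x - J @ x * dt + np.sqrt(2.0 * temperature * dt) * noise
        samples.append(x.copy())
    return np.array(samples)

# Eight modes, as in the prototype, with arbitrary nearest-neighbor couplings.
n = 8
J = np.eye(n) + 0.3 * (np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))
trajectory = simulate_coupled_noisy_modes(J)
print(np.cov(trajectory[5_000:].T).round(2))  # empirical covariance of the samples
```

In the real device the noise and the couplings are physical rather than simulated, which is where the anticipated energy savings come from.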

Thermodynamic computers, the researchers note, are still at a relatively primitive stage compared with the maturity of digital neural networks. But they come with some striking potential benefits: Whitelam noted that thermodynamic computing could generate images using just one ten-billionth of the energy consumed by contemporary digital hardware.

“This research suggests that it’s possible to make hardware to do certain types of machine learning — here, image generation — with considerably lower energy cost than we do at present,” said Whitelam.

Diffusion models, the approach behind many modern image generators, create novel images from scratch in response to user prompts by learning to reverse a process that gradually turns images into noise. On digital hardware, that requires energy-intensive numerical operations and pseudorandom number generators to produce the noise. The team's training process lets a thermodynamic computer work fundamentally differently: because its components are intrinsically noisy, the physics of the hardware supplies the randomness itself, sidestepping the energy-intensive operations and removing the need for pseudorandom number generators. This is what makes thermodynamic computing a more sustainable option for image-generation tasks.
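For readers who want to see where the pseudorandom noise enters on digital hardware, here is a minimal sketch (Python; the 28-by-28 image size and the noise schedule are assumed values for illustration, not the team's code) of the forward noising step that standard diffusion training repeats many times. In a thermodynamic computer, the noise drawn digitally below would instead come directly from the hardware's thermal fluctuations.

```python
import numpy as np

rng = np.random.default_rng(0)  # the pseudorandom number generator a
                                # thermodynamic computer aims to do without

def forward_noising_step(x, beta):
    """One step of the standard diffusion forward process:
    x_t = sqrt(1 - beta) * x_{t-1} + sqrt(beta) * noise."""
    noise = rng.normal(size=x.shape)  # drawn digitally here; supplied by
                                      # physical fluctuations in thermodynamic hardware
    return np.sqrt(1.0 - beta) * x + np.sqrt(beta) * noise

# Illustrative use: gradually turn a digit-sized "image" into pure noise.
x = rng.random((28, 28))                     # hypothetical 28x28 input
for beta in np.linspace(1e-4, 0.02, 1000):   # an assumed noise schedule
    x = forward_noising_step(x, beta)
```

Generating an image means learning to run this process in reverse, which is where the bulk of the digital compute cost lies.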

Whitelam is candid about the challenges that lie ahead. “We still don’t really understand how to design a thermodynamic computer that would achieve DALL-E’s ability to generate images,” Chaudhuri said. This frank acknowledgment underscores the fundamental R&D still needed to improve the capabilities of these emerging thermodynamic computing systems.

To back up these results, Whitelam published a follow-up study in Physical Review Letters on January 20, based on simulations carried out on standard desktop computers. The simulations further demonstrated the feasibility of training neural networks to reverse a noising process and ultimately generate images.
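Those simulations are not reproduced here, but the core idea, learning a map that undoes added noise, can be illustrated with a deliberately tiny example. In the sketch below (Python; the one-dimensional toy data, the single noise level, and the linear least-squares "denoiser" are all simplifying assumptions, not the paper's method), noise is added to clean samples, the simplest possible denoiser is fit, and the error on held-out data drops accordingly. Real diffusion models swap the linear map for a deep network and repeat the procedure across many noise levels.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "dataset": one-dimensional values standing in for pixels.
clean = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Forward process: add Gaussian noise of known strength.
sigma = 1.0
noisy = clean + sigma * rng.normal(size=clean.shape)

# "Training": fit the least-squares affine map from noisy back to clean.
A = np.vstack([noisy, np.ones_like(noisy)]).T
w, b = np.linalg.lstsq(A, clean, rcond=None)[0]

# Held-out check: the learned map substantially reduces the error.
test_clean = rng.normal(loc=2.0, scale=0.5, size=1_000)
test_noisy = test_clean + sigma * rng.normal(size=test_clean.shape)
denoised = w * test_noisy + b
print("MSE before denoising:", np.mean((test_noisy - test_clean) ** 2).round(3))
print("MSE after denoising: ", np.mean((denoised - test_clean) ** 2).round(3))
```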

Even with these limitations, the promise of sharply reduced energy use makes thermodynamic computing an interesting space to watch. Realizing that promise, Whitelam emphasized, will require building hardware designed to deliver these benefits as the research advances.

“It will still be necessary to work out how to build the hardware to do this,” he concluded.