Stephen Whitelam is a staff scientist at Lawrence Berkeley National Laboratory in California, best known for his work at the forefront of thermodynamic computing. On January 20 he published simulations in Physical Review Letters that demonstrate a new approach to image generation, one with the potential to sharply reduce energy use compared with conventional AI methods.
Whitelam’s research demonstrates that a thermodynamic computer of this kind can be trained to produce recognizable images of handwritten digits without relying on energy-hungry digital neural networks. The method runs a simulated thermodynamic computer through a training series of images, and it draws its randomness from the device’s intrinsic thermal fluctuations rather than from the pseudorandom number generators (PRNGs) that digital hardware must emulate. Beyond its performance, the approach points toward more sustainable computing practices.
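The idea of noise-driven generation can be sketched in ordinary Python, with `np.random` standing in for the physical thermal noise a thermodynamic computer would supply for free. The energy landscape below is a toy double well, not Whitelam's actual model; it simply shows how injected noise plus a learned potential lets a sampler settle into stored patterns:

```python
import numpy as np

def langevin_sample(grad_energy, x0, steps=10_000, dt=1e-3, temperature=0.1,
                    rng=None):
    """Euler-Maruyama integration of overdamped Langevin dynamics.

    The Gaussian noise injected each step (generated here in software) plays
    the role that a physical thermodynamic computer would get for free from
    thermal fluctuations in its circuitry.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = float(x0)
    noise_scale = np.sqrt(2.0 * temperature * dt)
    for _ in range(steps):
        x += -grad_energy(x) * dt + noise_scale * rng.standard_normal()
    return x

# Toy energy landscape: a double well U(x) = (x^2 - 1)^2 whose minima at
# x = +1 and x = -1 stand in for "learned" patterns the device settles into.
grad_U = lambda x: 4.0 * x * (x**2 - 1.0)
sample = langevin_sample(grad_U, x0=0.0, rng=np.random.default_rng(1))
print(sample)  # lands near one of the two minima
```

In a real thermodynamic computer, the loop above is not executed at all: the physics of the device performs the equivalent relaxation, which is where the energy savings come from.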
Normal Computing, a New York City-based startup co-founded by Harris, has designed a prototype silicon chip containing eight of these resonators, linked by specially designed couplers. With this design, such a computer could in principle perform these computations with as little as one ten-billionth of the energy required by today’s conventional approaches. Whitelam emphasized the potential of the technology: “This research suggests that it’s possible to make hardware to do certain types of machine learning — here, image generation — with considerably lower energy cost than we do at present.”
Whitelam and a coauthor published that work in Nature Communications on Jan. 10, building on his earlier results in Physical Review Letters. The key step was creating a thermodynamic version of a neural network and training it to reverse a noising process, the same principle that makes diffusion models so powerful at generating new images.
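The reverse-process idea behind diffusion models can be illustrated with a toy one-dimensional example. Here the data distribution is Gaussian, so the score of every noised marginal is known in closed form and no trained network is needed; a real diffusion model, and its thermodynamic analogue, would learn this quantity instead:

```python
import numpy as np

# Toy score-based diffusion in 1-D. The "data" distribution is N(mu, sigma^2).
mu, sigma = 3.0, 0.5
rng = np.random.default_rng(0)

def marginal(t):
    # Variance-preserving forward process: x_t = e^{-t/2} x_0 + noise,
    # so x_t ~ N(e^{-t/2} mu, e^{-t} sigma^2 + 1 - e^{-t}).
    a = np.exp(-t)
    return np.sqrt(a) * mu, a * sigma**2 + (1.0 - a)

def score(x, t):
    # Gradient of log p_t(x); this is what a diffusion model learns.
    m, v = marginal(t)
    return -(x - m) / v

# Reverse-time SDE, integrated from t = T down to t = 0 (Euler-Maruyama).
T, steps = 5.0, 2000
dt = T / steps
x = rng.standard_normal(10_000)   # start from pure noise, x_T ~ N(0, 1)
for i in range(steps):
    t = T - i * dt
    x += (0.5 * x + score(x, t)) * dt + np.sqrt(dt) * rng.standard_normal(x.size)

# Running the noising process backward recovers the data distribution.
print(x.mean(), x.std())
```

The forward process destroys structure by adding noise; the trained reverse process rebuilds it, which is how diffusion models turn random noise into new images.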
Even with these advances, Whitelam cautions that thermodynamic computers remain at a primitive stage relative to digital neural networks. “We don’t yet know how to design a thermodynamic computer that would be as good at image generation as, say, DALL-E,” he explained, and he readily admits that more research is needed before the hardware can reach that goal. “It will still be necessary to work out how to build the hardware to do this,” he added.
Whitelam’s ongoing work focuses on building new thermodynamic computers that could solve difficult problems at a fraction of the usual energy cost. If it succeeds, the approach could change the state of the art for AI image generation and many other machine-learning applications by making them dramatically more energy efficient.

