- Thermodynamic computing uses physical energy flows, rather than fixed digital circuits, to perform AI calculations.
- Image data can degrade naturally due to tiny fluctuations in computer components.
- Scaling to complex image generation will require entirely new hardware designs and approaches.
Scientists are exploring a new type of computing that uses natural energy flows to potentially perform AI tasks more efficiently.
Unlike traditional digital computers, which rely on fixed circuits and exact calculations, thermodynamic computing uses chance, noise, and physical interactions to solve problems.
The idea is that this method could allow AI tools, including image generators, to operate on much less power than current systems.
How thermodynamic image generation works
Thermodynamic image generation works very differently from conventional computing. The process begins with the computer receiving a set of images, which it then allows to “degrade”.
In this context, degrading does not mean the images are deleted or damaged; rather, the image data is allowed to drift and change naturally due to tiny fluctuations in the system.
These fluctuations are caused by physical energy flowing through the computer’s components, such as tiny currents and vibrations.
Over time, these interactions cause the images to become blurry or noisy, creating a kind of natural disorder. The system then measures how likely it is to reverse that disorder, adjusting its internal parameters to make reconstruction more likely.
By repeating this process many times, the computer gradually restores the original images, without following the step-by-step logic used by conventional computers.
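To make the degrade-and-restore loop concrete, here is a minimal software sketch in Python. It is only an analogy: the noise that a thermodynamic computer would get for free from physical fluctuations is simulated with a random number generator, and a two-parameter linear model stands in for the machine's adjustable internal parameters. None of the names or numbers below come from Whitelam's actual scheme.

```python
# Toy sketch of "let data degrade, then learn to reverse the degradation".
# Purely illustrative; not Whitelam's method or real thermodynamic hardware.

import numpy as np

rng = np.random.default_rng(seed=0)

# Toy "image data": 1,000 samples from a simple target distribution.
data = rng.normal(loc=2.0, scale=0.5, size=1000)

BETA = 0.1    # fraction of signal replaced by fresh noise per step
N_STEPS = 10  # how many degradation steps the data undergoes

def degrade(x):
    """Let the data degrade: repeatedly mix it with random fluctuations."""
    for _ in range(N_STEPS):
        x = np.sqrt(1 - BETA) * x + np.sqrt(BETA) * rng.normal(size=x.shape)
    return x

# "Adjust internal parameters to make reconstruction more likely": fit a
# scale w and offset b so that (w * degraded + b) lands close to the
# original data, using plain gradient descent on the mean squared error.
w, b = 0.0, 0.0
lr = 0.05
for _ in range(1000):
    noisy = degrade(data)
    err = (w * noisy + b) - data
    w -= lr * 2 * np.mean(err * noisy)  # gradient of the MSE w.r.t. w
    b -= lr * 2 * np.mean(err)          # gradient of the MSE w.r.t. b

# Check: the learned reverse step pulls degraded samples back toward
# the originals.
noisy = degrade(data)
restored = w * noisy + b
print(f"mean squared error, degraded: {np.mean((noisy - data) ** 2):.3f}")
print(f"mean squared error, restored: {np.mean((restored - data) ** 2):.3f}")
```

Running the sketch shows the error falling after the learned reverse step. In the hardware version of this idea, the degradation would be done by physical fluctuations in the chip itself rather than by arithmetic, which is where the claimed energy savings would come from.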
Stephen Whitelam, a researcher at Lawrence Berkeley National Laboratory, has demonstrated that thermodynamic computing can produce simple images such as handwritten digits.
These outputs are much simpler than those of AI image generators like DALL-E or Google Gemini’s Nano Banana Pro.
Yet the research demonstrates that physical systems can perform basic machine learning tasks, pointing to a new way AI could work.
However, scaling this approach to produce high-quality, complex images will require new types of hardware.
Proponents claim that thermodynamic computing could reduce the energy needed to generate AI images by a factor of ten billion compared to standard computers.
If successful, this would significantly reduce the energy consumption of data centers running AI models.
Although the first thermodynamic computing chip has been made, current prototypes are basic and cannot compete with traditional AI tools.
The researchers emphasize that the work is a proof of principle, and that practical implementations will require breakthroughs in both hardware and computing design.
“This research suggests that it is possible to make hardware to perform certain types of machine learning… with significantly lower energy cost than today,” Whitelam told IEEE Spectrum.
“We don’t yet know how to design a thermodynamic computer that would be as good at generating images as, say, DALL-E… we’ll still have to figure out how to build the hardware to do that.”