
Could Thermodynamic Computing Revolutionize AI and Scientific Research?


A new computing paradigm—thermodynamic computing—has entered the scene. Okay, okay, maybe it’s just probabilistic computing by a new name. Both use noise, such as that caused by thermal fluctuations, to perform computations rather than fighting it. But still, it’s a new physical approach.

“If you’re talking about computing paradigms, no, it’s this same computing paradigm,” as probabilistic computing, says Behtash Behin-Aein, the CTO and founder of probabilistic computing startup Ludwig Computing (named after Ludwig Boltzmann, a scientist largely responsible for the field of, you guessed it, thermodynamics). “But it’s a new implementation,” he adds.

In a recent publication in Nature Communications, New York-based startup Normal Computing detailed their first prototype of what they call a thermodynamic computer. They demonstrated that they can use it to harness noise to invert matrices, and also to perform Gaussian sampling, which underlies some AI applications.

How Noise Can Aid Some Computing Problems

Conventionally, noise is the enemy of computation. However, certain applications actually rely on artificially generated noise. And using naturally occurring noise can be vastly more efficient.
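As a generic illustration of an algorithm that relies on artificially generated noise (not an example taken from Normal Computing’s paper), consider the textbook Monte Carlo estimate of π: every sample consumes random numbers that a conventional computer has to produce deliberately, whereas a thermodynamic or probabilistic machine could draw them from physical fluctuations.

    import numpy as np

    # Monte Carlo estimate of pi: an algorithm that consumes random numbers by design.
    # On conventional hardware the randomness must be generated explicitly; a
    # thermodynamic or probabilistic computer could instead draw it from physical noise.
    rng = np.random.default_rng(seed=1)
    n = 1_000_000

    # Sample points uniformly in the unit square and count those inside the quarter circle.
    x, y = rng.random(n), rng.random(n)
    inside = (x**2 + y**2) <= 1.0

    pi_estimate = 4.0 * inside.mean()
    print(pi_estimate)  # ~3.14; the estimate sharpens as n grows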

“We’re focusing on algorithms that are able to leverage noise, stochasticity, and non-determinism,” says Zachery Belateche, silicon engineering lead at Normal Computing. “That algorithm space turns out to be huge, everything from scientific computing to AI to linear algebra. But a thermodynamic computer is not going to be helping you check your email anytime soon.”

For these applications, a thermodynamic—or probabilistic—computer starts out with its components in some semi-random state. Then, the problem the user is trying to solve is programmed into the interactions between the components. Over time, these interactions allow the components to come to equilibrium. This equilibrium is the solution to the computation.
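A minimal way to see this in software, assuming nothing about any particular hardware, is to simulate overdamped Langevin dynamics: a drift term plays the role of the programmed interactions, injected noise plays the role of thermal fluctuations, and after enough time the state settles into an equilibrium distribution. For a quadratic energy that equilibrium is a Gaussian, which is the kind of sampling Normal Computing reported. The sketch below is an illustrative simulation of the principle, not the company’s code; the matrix P stands in for the programmed interactions.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical "programmed interactions": a symmetric positive-definite precision matrix.
    # The equilibrium distribution of the dynamics below is the Gaussian N(0, inv(P)).
    P = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

    dt = 1e-3          # time step of the simulated dynamics
    steps = 400_000    # total updates
    burn_in = 50_000   # discard samples taken before equilibrium is reached

    x = rng.normal(size=2)   # semi-random initial state
    samples = []

    for step in range(steps):
        # Overdamped Langevin update: drift toward low energy plus thermal-style noise.
        x = x - P @ x * dt + np.sqrt(2.0 * dt) * rng.normal(size=2)
        if step >= burn_in:
            samples.append(x.copy())

    # At equilibrium the empirical covariance approaches inv(P), the target Gaussian's covariance.
    samples = np.array(samples)
    print(np.round(np.cov(samples, rowvar=False), 3))
    print(np.round(np.linalg.inv(P), 3))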

This approach is a natural fit for certain scientific computing applications that already include randomness, such as Monte Carlo simulations. It is also well suited to diffusion-based AI image generation, such as Stable Diffusion, and to a type of AI known as probabilistic AI. Surprisingly, it also appears to be well suited for some linear algebra computations that are not inherently probabilistic, which makes the approach more broadly applicable to AI training.

“Now we see with AI that paradigm of CPUs and GPUs is being used, but it’s being used because it was there. There was nothing else. Say I found a gold mine. I want to basically dig it. Do I have a shovel? Or do I have a bulldozer? I have a shovel, just dig,” says Mohammad C. Bozchalui, the CEO and co-founder of Ludwig Computing. “We are saying this is a different world which requires a different tool.”

Normal Computing’s Approach

Normal Computing’s prototype chip, which they termed the stochastic processing unit (SPU), consists of eight capacitor-inductor resonators and random noise generators. Each resonator is connected to each other resonator via a tunable coupler. The resonators are initialized with randomly generated noise, and the problem under study is programmed into the couplings. After the system reaches equilibrium, the resonator units are read out to obtain the solution.
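The same workflow can be mocked up in software for an SPU-sized system, reusing the Langevin-style dynamics sketched earlier; this is an illustrative simulation under those assumptions, not the chip’s actual physics. The matrix to be inverted is “programmed” into the couplings, the eight units start in random states, and the readout is simply the time-averaged covariance of the equilibrium fluctuations, which approximates the matrix inverse.

    import numpy as np

    rng = np.random.default_rng(42)

    # "Program" an 8x8 problem into the couplings: a symmetric positive-definite
    # matrix A stands in for the tunable coupler settings.
    n = 8
    M = rng.normal(size=(n, n))
    A = M @ M.T + n * np.eye(n)   # well-conditioned, symmetric positive definite

    dt = 1e-3
    burn_in = 50_000
    n_readout = 500_000

    x = rng.normal(size=n)        # units initialized with random noise
    acc = np.zeros((n, n))        # accumulator for the readout statistics

    for step in range(burn_in + n_readout):
        # Noisy coupled dynamics: the couplings pull the units toward equilibrium,
        # while injected noise keeps them fluctuating around it.
        x = x - A @ x * dt + np.sqrt(2.0 * dt) * rng.normal(size=n)
        if step >= burn_in:
            acc += np.outer(x, x)

    # Readout: the averaged outer products approximate inv(A).
    A_inv_estimate = acc / n_readout
    print(np.max(np.abs(A_inv_estimate - np.linalg.inv(A))))  # small residual error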

“In a conventional chip, everything is very highly controlled,” says Gavin Crooks, a staff research scientist at Normal Computing. “Take your foot off the control a little bit, and the thing will naturally start behaving more stochastically.”

Although this was a successful proof of concept, the Normal Computing team acknowledges that the prototype is not scalable. But they have amended their design, getting rid of the tricky-to-scale inductors. They now plan to implement their next design in silicon, rather than on a printed circuit board, and expect the next chip to come out later this year.

How far this technology can be scaled remains to be seen. The design is CMOS-compatible, but there is a lot to be worked out before it can be used to solve large-scale real-world problems. “It’s amazing what they have done,” Bozchalui of Ludwig Computing says. “But at the same time, there is a lot to be worked out to really take it from what it is today to a commercial product, to something that can be used at scale.”

A Different Vision

Although probabilistic computing and thermodynamic computing are essentially the same paradigm, there is a cultural difference. The companies and researchers working on probabilistic computing almost exclusively trace their academic roots to the group of Supriyo Datta at Purdue University. The three cofounders of Normal Computing, however, have no ties to Purdue and come from backgrounds in quantum computing.

As a result, the Normal Computing cofounders have a slightly different vision. They imagine a world in which different kinds of physics are harnessed in dedicated computing hardware, and every problem is matched with the hardware implementation best suited to it.

“We coined this term physics-based ASICs,” Normal Computing’s Belateche says, referring to application-specific integrated circuits. In their vision, a future computer will have access to conventional CPUs and GPUs, but also a quantum computing chip, a thermodynamic computing chip, and any other paradigm people might dream up. And each computation will be sent to an ASIC that uses the physics that’s most appropriate for the problem at hand.
