Probabilistic computers, systems built from standard, conventional components that use randomly flipping “probabilistic bits” to explore solution spaces, can match or surpass the performance of a large quantum annealer on a flagship optimisation problem, according to a study published in Nature Communications.
According to the international research team, which includes scientists from Italy, the United States, the Netherlands and Poland, the machines operate at room temperature, avoid cryogenic infrastructure and can be scaled with familiar hardware.
Optimisation mathematics underpins logistics, manufacturing and energy systems. The authors describe these tasks as problems in which the number of valid configurations “grows faster than they can be calculated and analysed,” making it difficult for conventional systems to find the true optimum even though they can reach reasonable averages quickly.
Quantum computers were expected to offer advantages on these so-called combinatorial problems. Research often focuses on Ising models: grids of binary “spins” that interact with their neighbours. Each spin can take the value +1 or –1, and finding the configuration with the lowest energy corresponds to identifying the best solution to a complex yes/no decision pattern.
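For readers who want to see the arithmetic behind that picture, the short Python sketch below computes the energy of one spin configuration; the coupling matrix, function name and toy values are illustrative assumptions, not taken from the study. The optimisation task is to find the configuration of +1/–1 values that makes this number as small as possible.

```python
import numpy as np

def ising_energy(spins, couplings):
    """Energy of a spin configuration: E = -sum over pairs of J_ij * s_i * s_j.

    spins     -- 1D array of +1/-1 values, one per spin
    couplings -- symmetric matrix J; J[i, j] is the interaction between
                 spins i and j (zero where they do not interact)
    """
    # Each pair is counted twice in the full matrix product, hence the 0.5.
    return -0.5 * spins @ couplings @ spins

# Toy example: the first pair of spins prefers alignment (+1 coupling),
# the other pairs prefer opposition (-1 coupling).
J = np.array([[ 0.0,  1.0, -1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0, -1.0,  0.0]])
s = np.array([1, 1, -1])
print(ising_energy(s, J))  # lower values correspond to better solutions
```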
The new study examines one of the field’s best-known challenges: the 3D spin glass, with Marek M. Rams from the Jagiellonian University in Kraków contributing analysis of spin-glass physics and system behaviour.
In this system, interactions between spins randomly favour either alignment or opposition, creating an energy landscape with many local minima. D-Wave has previously reported that its quantum annealer scales more favourably than classical algorithms on this benchmark.
Quantum annealing is typically compared to controlled cooling. A system begins in a highly fluctuating quantum state and the strength of those fluctuations is gradually reduced. If the process succeeds, the machine settles into a low-energy configuration representing a strong solution to the optimisation problem.
Rather than using a quantum device, the authors implemented quantum annealing simulations on classical processors. Their discrete-time simulated quantum annealing (DT-SQA) approach replaces the quantum system with thousands of classical replicas connected along an additional time dimension. “Instead of having a single lattice of magnets, we have, for example, several thousand of them, interconnected to mimic the quantum effect,” the team wrote.
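The paper’s exact implementation is not reproduced here, but a minimal Python sketch of the general path-integral idea, with the replica array, the inter-replica coupling and the Metropolis rule all chosen for illustration, looks roughly like this: the transverse-field strength gamma is lowered over the run, which stiffens the links between neighbouring replicas until they agree on a single classical configuration.

```python
import numpy as np

def sqa_sweep(replicas, J, gamma, T):
    """One Metropolis sweep of a simplified simulated-quantum-annealing step.

    replicas -- array of shape (M, N): M classical copies of N spins,
                coupled along an extra 'imaginary time' dimension
    J        -- N x N problem coupling matrix (assumed to have a zero diagonal)
    gamma    -- current transverse-field strength (reduced during the anneal)
    T        -- simulation temperature
    """
    M, N = replicas.shape
    # Coupling between neighbouring replicas; it grows as gamma shrinks,
    # forcing the copies toward one shared classical configuration.
    j_perp = -0.5 * T * np.log(np.tanh(gamma / (M * T)))
    for k in range(M):
        for i in np.random.permutation(N):
            s = replicas[k, i]
            # Field from the problem couplings within this replica...
            h = J[i] @ replicas[k] / M
            # ...plus the ferromagnetic link to the adjacent replicas.
            h += j_perp * (replicas[(k - 1) % M, i] + replicas[(k + 1) % M, i])
            # Energy change if spin i in replica k is flipped.
            dE = 2.0 * s * h
            if dE <= 0 or np.random.rand() < np.exp(-dE / T):
                replicas[k, i] = -s
    return replicas
```

A full anneal would call this sweep repeatedly while stepping gamma down from a large starting value towards zero, for example over a schedule such as np.linspace(3.0, 0.05, num_sweeps).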
The researchers also deployed adaptive parallel tempering (APT), which runs multiple copies of the same model at different temperatures. High-temperature copies jump broadly across the solution landscape, while low-temperature copies refine details. Periodic swaps help prevent the system from becoming stuck in local minima. The method was further strengthened with cluster moves that flip entire groups of interconnected spins without changing total energy, a step the authors compare to shuffling a puzzle to reveal better configurations.
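The adaptive part of APT lies in how the temperature set and the cluster moves are tuned, which goes beyond a short example, but the core replica-exchange step can be sketched as follows (Python; the function name, data layout and bookkeeping are illustrative assumptions, not the authors’ code):

```python
import numpy as np

def try_temperature_swaps(energies, betas, rng=np.random.default_rng()):
    """Propose swaps between copies at neighbouring temperatures.

    energies -- current energy of each copy, ordered by temperature
    betas    -- inverse temperatures (1/T) for each copy, same order
    Returns a permutation telling which configuration each temperature
    slot should hold after the swap attempts.
    """
    order = np.arange(len(betas))
    for k in range(len(betas) - 1):
        a, b = order[k], order[k + 1]
        # Standard replica-exchange rule: accept the swap with probability
        # min(1, exp((beta_k - beta_{k+1}) * (E_a - E_b))).
        delta = (betas[k] - betas[k + 1]) * (energies[a] - energies[b])
        if delta >= 0 or rng.random() < np.exp(delta):
            order[k], order[k + 1] = b, a
    return order
```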
The strongest gains emerged when these algorithms were executed on probabilistic hardware. A probabilistic bit, or p-bit, continuously switches between –1 and +1 “like a coin tossed in the air,” with its bias determined by its neighbours. Networks of p-bits naturally emulate Ising systems and can drift toward low-energy states without relying on quantum effects.
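A common textbook form of the p-bit update rule, shown here as a hedged Python sketch rather than the hardware implementation described in the paper, makes the “biased coin toss” explicit: the weighted states of the neighbours tilt the probability of the bit landing on +1.

```python
import numpy as np

def pbit_update(state, J, h, beta, rng=np.random.default_rng()):
    """Stochastic update of one randomly chosen p-bit.

    state -- array of +1/-1 p-bit values
    J, h  -- coupling matrix and local biases defining the Ising problem
    beta  -- inverse temperature controlling how strongly neighbours bias the flip
    """
    i = rng.integers(len(state))
    synapse = J[i] @ state + h[i]               # input from neighbouring p-bits
    p_up = 0.5 * (1 + np.tanh(beta * synapse))  # bias toward +1
    state[i] = 1 if rng.random() < p_up else -1
    return state
```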
The study notes that p-computers can be implemented on CPUs, GPUs, FPGAs and potentially future hybrid CMOS–nanomagnet platforms. All provide extensive parallelism, allowing large numbers of p-bits to be updated simultaneously.
To compare performance with D-Wave’s quantum annealer, the researchers tracked residual energy, the gap between the energy of the current solution and the energy of the true optimum, and measured how this value declined over time. With thousands of parallel replicas, DT-SQA produced improvement curves “comparable to, or even better than, that of a physical quantum annealer” on the same 3D problem.
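As an illustration of the metric (the per-spin normalisation here is a common convention in spin-glass benchmarks, not necessarily the exact definition used in the paper), residual energy can be computed as:

```python
def residual_energy_per_spin(found_energy, ground_state_energy, n_spins):
    """Gap between the energy of the solution found so far and the best
    known (or exact) ground-state energy, normalised per spin.
    A value of zero means the optimum has been reached."""
    return (found_energy - ground_state_energy) / n_spins
```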
APT with clustering showed the strongest results. Initially, improvement was modest, but performance then accelerated, ultimately reducing residual energy faster and producing better long-term solutions than DT-SQA. Tests across several system sizes suggest the effect is stable and may extend to larger, more complex problems.
The findings feed into a broader debate over quantum advantage: how much of a quantum machine’s performance stems from quantum physics and how much comes from engineering choices. Quantum processors must operate near absolute zero and remain vulnerable to decoherence, requiring large cryogenic systems and extensive error-correction hardware. These overheads can limit practical gains.
Probabilistic computers avoid such constraints. According to the authors, the energy required to update a single p-bit can be “several orders of magnitude lower” than the energy consumed by GPUs or TPUs performing comparable calculations. The hardware also scales readily and can be tailored to specific problem classes.
The study illustrates the role of Polish institutions in ongoing efforts to establish reliable criteria for quantum advantage and determine which classical approaches must be exhausted before any breakthrough claim can be substantiated.
(PAP)
PAP - Science in Poland
kmp/ agt/
tr. RL