A verifiable quantum advantage

October 22, 2025

Xiao Mi and Kostyantyn Kechedzhi, Research Scientists, Google Quantum AI

In our latest Nature publication, we introduce a new quantum computational task measuring Out-of-Time-Order Correlators (OTOCs). This work demonstrates a verifiable quantum advantage and paves the way for solving real-world problems like Hamiltonian learning in Nuclear Magnetic Resonance (NMR).

Nature is brimming with chaos, a phenomenon characterized by a system's extreme sensitivity to small perturbations. In the macroscopic world, notable examples of chaotic systems include weather patterns, wherein a small change in initial conditions leads to vastly different outcomes over time (often dubbed “the butterfly effect”), and population dynamics, where small shifts in local populations may eventually affect the entire ecosystem. Chaos is similarly abundant in quantum systems, with examples including the dynamics of the magnetization of atomic nuclei subjected to a time-varying magnetic field, and the flow of electrons in high-temperature superconductors.

Simulating quantum-chaotic systems is challenging for classical computers because the computational cost grows exponentially with system size, which makes such systems natural targets for demonstrating quantum advantage. In 2019, we demonstrated the first beyond-classical quantum computation by sampling bitstrings from a highly chaotic quantum state of qubits. However, this random circuit sampling approach has limited practical utility: in a large quantum system the same bitstring virtually never appears twice, which restricts how much useful information the samples can reveal.

In “Observation of constructive interference at the edge of quantum ergodicity”, featured on the cover of Nature, we introduce and experimentally demonstrate a quantum algorithm we call Quantum Echoes. The heart of the algorithm is measuring the expectation value of a quantum observable called the out-of-time-order correlator (OTOC). The OTOC and its higher-order generalizations are a new family of observables that describe how quantum dynamics become chaotic. Unlike bitstrings, quantum expectation values, e.g., current, velocity, magnetization and density, are verifiable computational outcomes: they remain the same when the computation is run on different quantum computers. The wide relevance of expectation values, combined with their verifiability, points to a direct path toward using OTOCs on quantum computers to solve real-world problems that cannot be solved on classical computers. Remarkably, we show that running the Quantum Echoes algorithm on the Willow quantum chip is already in the beyond-classical regime for a set of benchmarking quantum circuits.

Out-of-time-order correlator

In practice, the OTOC is read out from the state of a single qubit at the end of a series of quantum operations. In our experiments running Quantum Echoes on Willow, a total of 103 qubits underwent both “forward” (U) and “backward” (U†) evolutions in the form of random quantum circuits. A forward evolution applied to a state in which all qubits are independent of each other brings the system to a highly chaotic state with quantum correlations across all qubits. A perturbation, a one-qubit operation B, is applied to one qubit in between the two evolutions. This circuit is followed by another probe, a one-qubit operation M. Repeating this process once or twice yields an OTOC of first or second order. In the absence of B, the forward and backward evolutions return the system to the initial state, where all qubits are independent. Including the perturbation B sets off a butterfly effect: after such a perturbed forward and backward evolution, the whole system ends in a chaotic state, with quantum correlations across all qubits, that is very different from the initial state.
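To make this structure concrete, here is a minimal sketch of a first-order OTOC evaluated by exact statevector simulation on a handful of qubits. The use of NumPy/SciPy, the Haar-random stand-in for U, the Pauli-X perturbation B, and the Pauli-Z probe M are all illustrative assumptions; this is not the circuit run on Willow.

```python
# A minimal sketch (not the Willow experiment): the first-order OTOC
# <B(t)† M† B(t) M>, with B(t) = U† B U, evaluated by exact statevector
# simulation for a handful of qubits.
import numpy as np
from scipy.stats import unitary_group

n = 4                       # number of qubits, kept tiny for exact simulation
dim = 2 ** n

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # "butterfly" perturbation gate
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # probe / measurement operator

def embed(op, target):
    """Place a single-qubit operator on qubit `target`, identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, op if q == target else I2)
    return out

B = embed(X, 0)             # perturbation on one qubit
M = embed(Z, n - 1)         # probe on another qubit

U = unitary_group.rvs(dim)  # Haar-random stand-in for the chaotic forward evolution
Bt = U.conj().T @ B @ U     # forward, perturb, backward: the "echo" B(t) = U† B U

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0               # all qubits start in the ground state |0...0>

otoc = psi0.conj() @ Bt.conj().T @ M.conj().T @ Bt @ M @ psi0
print("first-order OTOC:", otoc.real)
```

Note that if the perturbation B is removed (set to the identity), B(t) collapses to the identity as well and the correlator is trivially 1, mirroring the statement above that the unperturbed forward and backward evolutions return the system to its initial state.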

A crucial insight we obtained from our experiments is that higher-order OTOCs exhibit complex quantum interference effects analogous to a traditional interferometer. This is known as many-body interference, meaning the quantum states of many particles interfere with each other, much like waves of water might interfere, leading to complex overall effects. Here the perturbations, B and M, act as imperfect mirrors that modify the system’s trajectories. Higher-order OTOCs become more sensitive to the perturbation due to the increasing number of “round trip” evolutions, in which the trajectories bounce off B and M. When a resonance condition is satisfied, which corresponds to the backward evolution U† being the exact inverse of the forward evolution U, the interference is constructive and amplifies a subset of the quantum correlations present in the chaotic state. More specifically, this interferometry reveals how the evolution U generates correlations between the two qubits where the operations B and M were applied. It can therefore be used as a sensitive instrument to characterize the evolution U.
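The round-trip picture can be spelled out in the same toy model. Following the labels in the figure below, where C(2) and C(4) denote the first- and second-order OTOCs, the sketch evaluates k repetitions of the pair B(t)·M; again, the Haar-random U and the Pauli choices are illustrative assumptions, not the experimental circuits.

```python
# Illustrative sketch (same toy model as above, not the experimental circuits):
# higher-order OTOCs as repeated "round trips" through B(t) = U† B U and M.
# In the labeling of the figure below, k = 2 and k = 4 correspond to the
# first- and second-order OTOCs, C(2) and C(4).
import numpy as np
from scipy.stats import unitary_group

n, dim = 4, 2 ** 4
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(op, target):
    out = np.array([[1.0 + 0j]])
    for q in range(n):
        out = np.kron(out, op if q == target else I2)
    return out

B, M = embed(X, 0), embed(Z, n - 1)
U = unitary_group.rvs(dim)          # stand-in for the chaotic evolution
Bt = U.conj().T @ B @ U             # one bounce off the "imperfect mirror" B

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0

for k in (2, 4):                    # more round trips -> higher sensitivity to B
    loops = np.linalg.matrix_power(Bt @ M, k)
    print(f"C({k}) =", (psi0.conj() @ loops @ psi0).real)
```

Because the pair B(t)·M is applied k times, any mismatch between the forward and backward evolutions compounds with each round trip, which is the mechanism behind the increased sensitivity of the higher orders.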

OTOC-1

Left: Quantum circuit measuring OTOCs of different orders, k. Qubits are initialized in the ground state, with one qubit prepared in the state denoted by |𝜓M〉. A complex many-body evolution (U) is implemented by the quantum processor, consisting of one- and two-qubit operations applied to neighboring qubits on a two-dimensional grid. The evolution is then reversed (U†) after perturbing one qubit with a gate B, followed by a probe operation M on the initially prepared qubit |𝜓M〉. This sequence is repeated k times before the qubit prepared in |𝜓M〉 is measured. Right: Conceptual representation of OTOCs of different orders as interferometers.

The interferometric nature of the OTOC leads to two consequences crucial for attaining quantum advantage. First, the forward and backward evolutions partially reverse the effects of chaos and amplify the quantum signal measured at the end. We observed the signature of this amplification in our OTOC signals. More specifically, the OTOC signal magnitude, characterized by the width of the distribution of OTOC values over an ensemble of random circuits, decays only as a power of time, whereas quantum signals measured without backward evolution decay exponentially. The slow power-law decay of OTOCs suggests that measuring these quantities on a quantum computer is significantly more efficient than classical simulation, whose cost increases exponentially over time.

OTOC-2

Left: Time-dependent values of signals measured without time inversion (gray) and with time inversion (magenta, blue, green). The vertical axis shows the standard deviation over random circuits of the correlation function C(1) and the first- and second-order OTOCs, C(2) and C(4). Right: A set of second-order OTOC values measured on a Willow device, each data point estimated to require 3.2 years to simulate on the Frontier supercomputer. The horizontal axis labels instances of random circuits. A total of 65 of the 105 available qubits are used in this experiment.

The computational gap between quantum and classical processors

The second consequence of many-body interference is its high classical computational cost. A central task in quantum computing is to identify the gap in computational cost between quantum and classical computers on specific computational tasks. We approached this in two ways: (1) through a combination of theoretical analysis and experiments, we revealed the fundamental obstacles that prevent known classical algorithms from reproducing the outcome of our OTOC calculations on Willow, and (2) we tested the performance of nine relevant classical simulation algorithms by direct implementation and cost estimation.

In the first approach, we identified quantum interference itself as an obstacle to classical computation. A distinguishing characteristic of quantum mechanics is that predicting the outcome of an experiment requires analyzing probability amplitudes rather than probabilities, as in classical mechanics. Well-known examples are the entanglement of light, which manifests as quantum correlations between photons (the elementary particles of light) that persist over long distances (recognized by the 2022 Nobel Prize in Physics), and macroscopic quantum tunneling phenomena in superconducting circuits (recognized by the 2025 Nobel Prize in Physics).

The interference in our second-order OTOC data (i.e., an OTOC that runs through the backward and forward circuit loop twice) reveals a similar distinction between probabilities and probability amplitudes. Crucially, probabilities are non-negative numbers, whereas probability amplitudes can have arbitrary sign and are described by complex numbers. Taken together, these features mean that amplitudes encode a much more complex collection of information. Instead of a pair of photons or a single superconducting junction, our experiment is described by probability amplitudes across an exponentially large space of 65 qubits. An exact description of such a quantum mechanical system requires storing and processing 2^65 complex numbers in memory, which is beyond the capacity of supercomputers. Moreover, quantum chaos in our circuits ensures that every amplitude is equally important, so algorithms that use a compressed description of the system also require memory and processing time beyond the capacity of supercomputers.
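A back-of-the-envelope calculation (my arithmetic, not a figure quoted in the paper) makes the scale concrete, assuming 16 bytes per complex amplitude:

```python
# Back-of-the-envelope arithmetic (an estimate, not a figure from the paper):
# memory needed to store the full statevector of 65 qubits, at one complex
# double-precision number (16 bytes) per amplitude.
n_qubits = 65
bytes_per_amplitude = 16                  # complex128: two 8-byte floats

amplitudes = 2 ** n_qubits
total_bytes = amplitudes * bytes_per_amplitude

print(f"amplitudes: 2^{n_qubits} ~ {amplitudes:.2e}")
print(f"memory:     ~ {total_bytes / 1e18:.0f} exabytes")
# ~590 exabytes, while the total memory of today's largest supercomputers
# is measured in petabytes -- roughly five orders of magnitude smaller.
```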

Our further theoretical and experimental analysis revealed that carefully accounting for the signs of the probability amplitudes is necessary in any numerical calculation that aims to predict our experimental data. This presents a significant barrier for a class of efficient classical algorithms, quantum Monte Carlo, that have been successful at describing quantum phenomena in large quantum mechanical spaces (e.g., the superfluidity of liquid Helium-4). These algorithms rely on a description in terms of probabilities, and our analysis demonstrates that such approaches would result in an uncontrollable error in the computation output.

Our direct implementation of algorithms based on both compressed representations and efficient quantum Monte Carlo confirmed that neither can predict our second-order OTOC data. Our experiment on Willow took approximately 2 hours, a task estimated to require 13,000 times longer on a classical supercomputer. This conclusion was reached after an estimated 10 person-years of classical red-teaming of our quantum result, during which we implemented a total of nine classical simulation algorithms.

Practical application of OTOC circuits

Having established the beyond-classical complexity of OTOCs, we began exploring how they could be applied to real-world problems of practical interest. To this end, we proposed Hamiltonian learning, a scheme in which the quantum computer simulates OTOC signals from a physical system found in nature, such as a molecule, whose parameters are not fully known. We then compare the quantum computer's OTOC signals against measured data from the physical system and look for the parameter values at which the two best agree. By finding this agreement, we aim to obtain a more precise estimate of the system parameters than is possible through other techniques.
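A minimal sketch of that matching loop is shown below. Everything in it is an illustrative assumption: `simulate_otoc` stands in for running Quantum Echoes on a quantum processor with a candidate parameter value, and `measured_signal` stands in for data recorded from the physical system.

```python
# A minimal sketch of the Hamiltonian-learning loop (illustrative assumptions
# throughout): `simulate_otoc` stands in for running Quantum Echoes on a
# quantum processor with a candidate parameter value, and `measured_signal`
# stands in for data recorded from the physical system, e.g., an NMR sample.
import numpy as np

def simulate_otoc(coupling, times):
    """Hypothetical stand-in for the OTOC signal predicted at a given coupling."""
    return np.exp(-0.1 * times) * np.cos(coupling * times)

times = np.linspace(0.0, 10.0, 50)
true_coupling = 1.3                                     # unknown in practice
measured_signal = simulate_otoc(true_coupling, times)   # would come from experiment
measured_signal += np.random.default_rng(0).normal(0.0, 0.02, times.size)

# Sweep candidate parameter values and keep the one whose simulated OTOC
# signal agrees best (least squares) with the measured data.
candidates = np.linspace(0.5, 2.0, 301)
residuals = [np.sum((simulate_otoc(c, times) - measured_signal) ** 2)
             for c in candidates]
best = candidates[int(np.argmin(residuals))]
print(f"best-fit coupling: {best:.3f} (true value {true_coupling})")
```

In the actual scheme, the simulated signal would come from running Quantum Echoes on the quantum processor, which is exactly the step that is beyond the reach of classical simulation.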

To make this scheme practical, we have to find systems in nature that can realize our Quantum Echoes algorithm, and then simulate these systems on our quantum hardware. As a step toward this goal, in “Quantum computation of molecular geometry via many-body nuclear spin echoes”, we test this concept using nuclear magnetic resonance (NMR) spectroscopy. In NMR, one uses the precession of nuclear spins in a large magnetic field to learn the structure of molecules and materials, like the proteins in your body or the battery components in your phone. Nuclear spins obey the laws of quantum mechanics, and under certain conditions (namely in solids or solid-like materials) they exhibit the same quantum-chaotic behavior described above. This makes them a perfect candidate for the OTOC protocol.

In this preprint, which will be submitted for peer review, we measured OTOCs on two organic molecules dissolved in liquid crystal at the Pines Magnetic Resonance Center at UC Berkeley. The experiment was then simulated on our Willow chip, resulting in improved models of the molecular structure. Due to the inherent complexity of simulating real-world systems and the performance limits of our current chip, this initial demonstration is not yet beyond-classical. However, our results demonstrate sensitivity to molecular details, and we're confident that this path will lead to some of the first useful applications of quantum computation.

OTOC-3

A schematic for refining knowledge of a physical quantum system through the quantum computer, known as Hamiltonian learning.

Conclusion

We have performed the first quantum computing experiment measuring a quantum observable that is both verifiable through another quantum computer or a natural quantum system, and beyond the simulation capacity of known classical algorithms. This experiment was made possible by our recent hardware advancement, and paves the way toward the first real-world application of quantum computers in probing the microscopic structures of physical systems such as molecules.

Acknowledgements

This work involved many members of the Quantum AI team, along with Google DeepMind and external collaborators at UC Berkeley, Dartmouth College, QSimulate, and NVIDIA.