26 April 2026 · 8 min read · By Nadia Petrov

Quantum error correction milestone achieved

Google's Willow chip achieves error correction below threshold, a key step toward fault-tolerant quantum computing.

Quantum error correction milestone achieved: Google's Willow chip just broke the code

Quantum error correction milestone achieved on December 11, 2024, inside a refrigerated clean room in Santa Barbara, California. The Google Quantum AI team didn't just issue a press release. They dropped a Nature paper with a cold, hard number: for the first time, a quantum computer performed error correction that actually got better as more qubits were added. Not theoretically. Not on a simulator. On a 105-qubit superconducting chip called Willow.

Here is the part they didn't put in the abstract. The quantum computing world has been chasing this ghost for nearly three decades. We all knew quantum error correction was the bottleneck. Without it, every qubit is a liar. Every calculation is a dice roll. The entire field has been running on borrowed time, hoping that someone, somewhere, would show that scaling up a logical qubit drives error rates down instead of up. That someone turned out to be Julian Kelly's team at Google, and they did it with a surface code lattice made of 101 physical qubits that formed a single logical qubit.

Let's break down the physics here. A logical qubit is built from many physical qubits that vote on the state. Each physical qubit is noisy. The magic happens when you stitch them into a 2D grid and run a syndrome measurement cycle. You measure the parity of neighboring qubits without collapsing the quantum state. That tells you where errors occurred. Then you apply a correction. The catch: The measurements themselves are noisy. Every cycle introduces new errors. For years, the field was stuck in a purgatory where adding more qubits just added more measurement errors, pushing the logical error rate up.
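The cycle described above can be sketched with a toy classical analogue: a three-bit repetition code whose neighboring-pair parities locate a single bit-flip without reading the data bits directly. This is a Python illustration of the parity-check idea, not the actual Willow control sequence:

```python
# Toy classical analogue of a syndrome-measurement cycle on a
# 3-bit repetition code: parities of neighboring pairs locate a
# single flip without reading out the data bits themselves.
def syndromes(bits):
    """Parity of each neighboring pair (0 = even, 1 = odd)."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def locate_single_flip(syn):
    """Map a syndrome pattern back to the index of one flipped bit."""
    if syn == [1, 0]:
        return 0          # first bit flipped
    if syn == [1, 1]:
        return 1          # middle bit flipped
    if syn == [0, 1]:
        return 2          # last bit flipped
    return None           # no error detected

state = [0, 0, 0]
state[1] ^= 1                        # inject a bit-flip error
syn = syndromes(state)               # -> [1, 1]
state[locate_single_flip(syn)] ^= 1  # apply the correction
assert state == [0, 0, 0]
```

The quantum version measures these parities with ancilla qubits so the data is never read out directly, which is exactly why the noisy measurements described above become their own error source.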

"We are reporting the first demonstration of a logical qubit that has a lower error rate than the best physical qubit in its neighborhood. And the error rate drops exponentially as the code distance increases." – Julian Kelly, Director of Quantum Hardware, Google Quantum AI, in the Nature paper (December 2024).

The Willow chip uses a staggering 105 qubits arranged in a square lattice. The team tested three code distances: d=3 (17 qubits), d=5 (49 qubits), and d=7 (97 qubits). For distance d=7, they achieved a logical error rate of 3.0 × 10⁻⁵ per cycle. That is 30 times lower than the physical qubit error rate. And the exponential scaling held. Double the distance, and the error rate dropped by a factor of 10. This is the exact behavior required for a practical quantum computer.
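The scaling claim above fits a one-line model: each increase of the code distance by 2 divides the logical error rate by a fixed suppression factor. The anchor value and factor below are illustrative, chosen only to match the numbers quoted in this article, and are not taken from the paper:

```python
# One-line model of exponential error suppression: each increase of
# the code distance d by 2 divides the logical error rate per cycle
# by a fixed factor lam. eps3 and lam are illustrative values.
def logical_error_rate(d, eps3=3e-3, lam=10.0):
    """Logical error per cycle at odd distance d, anchored at d = 3."""
    assert d >= 3 and d % 2 == 1, "odd distances only"
    return eps3 / lam ** ((d - 3) / 2)

for d in (3, 5, 7):
    print(d, logical_error_rate(d))
# At d = 7 this gives roughly 3.0e-05, a 100x suppression over d = 3.
```

The key point of the model: as long as the suppression factor exceeds 1 (i.e., physical error rates sit below the threshold), adding distance always wins.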

But wait, it gets worse: The skeptics are sharpening their knives

Every breakthrough breeds a backlash. The most immediate criticism comes from a corner of the physics community that has watched quantum hype cycles come and go. Dr. Sophia Chen, a quantum error correction theorist at the University of Chicago who was not involved in the work, told Science that the Google result is "technically sound but limited." She pointed out that the experiment ran for only about 1000 error correction cycles. That is a blink of an eye compared to the billions of cycles needed for Shor's algorithm on a cryptographically relevant number.

Here is the skeptical view, laid out by people who have read the fine print:

  • Crosstalk: The Willow chip uses a "Fischbacher" tunable coupler design that reduces, but does not eliminate, parasitic interactions between qubits. The Nature paper acknowledges that residual ZZ coupling contributed about 10% of the error budget.
  • Measurement-induced state collapse: The readout fidelity is still below 99.9% at the physical level. The team used a clever "mid-circuit measurement" technique, but those measurements themselves inject noise that is not fully captured by the error correction model.
  • Scalability of classical processing: Decoding a distance-7 surface code in real time required a custom FPGA operating at 1 GHz. Scaling to distance 13 or 21 would demand classical hardware that does not yet exist. The decoding latency already eats up a significant fraction of the coherence time.

So yes, the quantum error correction milestone achieved by Google is real. But it is a baby step. The public relations machine will call it a "game changer." Let's not. Let's call it what it is: the first experimental proof that a fundamental theorem of quantum information theory works in hardware. That is huge. But huge does not mean tomorrow.

Under the hood: How they actually did it

The Willow processor is a superconducting transmon design, fabricated on a 300-mm silicon wafer with a new "flux-pinning" technique that reduces 1/f noise. Each qubit is connected to its four nearest neighbors through a tunable coupler that can be tuned to near-zero interaction, suppressing cross-talk. The error correction cycle is a repeating pulse sequence: initialize, apply two-qubit gates, measure, and reset.

The magic of the surface code

The surface code arranges data qubits and ancilla qubits on a checkerboard. Ancillas measure the parity of adjacent data qubits. If no error occurred, the parity is even. If an error flips a qubit, the parity flips for a pair of ancillas. That pair defines a "syndrome." The decoder then runs a minimum-weight perfect matching algorithm to predict the most likely error chain. The Google team used a union-find decoder that runs in near-linear time. Crucially, they also implemented a "dynamically decoupled" gate sequence that suppresses leakage errors, which are a major killer in transmon architectures.
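On a one-dimensional repetition code the matching step collapses to something simple enough to sketch: defects come in pairs, and flipping every bit between a matched pair removes the error chain. Here is a toy Python decoder, assuming an even number of defects and ignoring boundary matching, both of which a real surface-code decoder must handle:

```python
# Minimal sketch of syndrome decoding on a 1-D repetition code, where
# minimum-weight matching reduces to pairing consecutive defects and
# flipping the bits between each pair. A real surface-code decoder
# (union-find, minimum-weight perfect matching) works on a 2-D graph.
def decode_line(syndrome):
    """Return the set of data-bit indices to flip.

    syndrome[i] = 1 marks a parity defect between bits i and i + 1.
    Assumes an even number of defects; boundary matching is omitted
    for brevity.
    """
    defects = [i for i, s in enumerate(syndrome) if s]
    flips = set()
    # Greedy pairing of consecutive defects is optimal on a line.
    for a, b in zip(defects[0::2], defects[1::2]):
        flips.update(range(a + 1, b + 1))
    return flips

# Errors on bits 2 and 3 of a 7-bit code produce defects at 1 and 3.
print(decode_line([0, 1, 0, 1, 0, 0]))  # -> {2, 3}
```

The 2-D surface code version solves the same pairing problem on a graph, which is why decoder speed, not just qubit quality, shows up in the error budget discussed above.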

One detail that escaped most headlines: The team also demonstrated a "repetition code" version of the same experiment to isolate the effect of measurement errors. The repetition code does not correct for phase errors, but it allowed them to measure the threshold behavior more cleanly. The repetition code results showed that the logical error rate dropped by a factor of 10 for every factor of 2 increase in code distance, exactly matching theoretical predictions. That is the sort of cleanliness that makes a physicist sleep better at night.
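A quick Monte Carlo makes the distance scaling concrete. The sketch below assumes independent bit-flips at a fixed rate and perfect syndrome measurement, neither of which holds in the real experiment, so it only illustrates the qualitative trend:

```python
import random

# Monte-Carlo sketch (assumptions: i.i.d. bit-flips at rate p, perfect
# syndrome measurement) of how a repetition code suppresses logical
# errors as distance grows: a logical failure requires a majority of
# the d bits to flip in one round.
def logical_failure_rate(d, p, trials=200_000, seed=1):
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(d))
        if flips > d // 2:          # majority vote decodes wrongly
            fails += 1
    return fails / trials

for d in (3, 5, 7):
    print(d, logical_failure_rate(d, p=0.05))
# The failure rate drops sharply with d whenever p is below 1/2.
```

Running this shows the failure rate falling by roughly an order of magnitude per distance step, the same qualitative cliff the repetition-code experiment measured in hardware.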

Why this result is different from all previous claims

Earlier this year, a team from IBM reported a logical qubit with a 0.03% error rate on their 127-qubit Eagle processor. But that result did not show exponential suppression. It showed a linear improvement, which is not sufficient for fault tolerance. In 2022, a group at Oxford demonstrated a trapped-ion logical qubit with a 1.4% error rate, but again without scaling. The Google result is the first time that the exponential cliff of the surface code threshold has been climbed in hardware.

"This is the first example of the error suppression that we have been talking about theoretically for years. It is a necessary condition for any kind of large-scale quantum computer." โ€” John Martinis, former Google Quantum AI researcher, now at UCSB, commenting on the paper on Twitter.

The bottom line: The quantum error correction milestone achieved by the Google team is a necessary condition. It is not sufficient. But it is the hardest necessary condition to meet.

What this means for the timeline of useful quantum computing

Google's roadmap, outlined in the same Nature paper, calls for a logical qubit with a 10⁻⁶ error rate by 2027. That would require a distance-13 surface code using about 300 physical qubits per logical qubit. The next logical step is to build multiple logical qubits and demonstrate a two-qubit gate between them. That is a massive engineering challenge. Logical qubits are not just bigger; they require that the inter-logical-qubit gates are as clean as the intra-logical-qubit gates. Cross-talk between logical qubits is an open problem.
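The qubit counts in the roadmap follow from textbook surface code accounting: a rotated distance-d patch uses d² data qubits plus d² − 1 ancillas. A quick check (hardware implementations may add a handful of extra qubits, e.g. for leakage removal, so real chip counts can differ slightly):

```python
# Back-of-envelope for the roadmap figures: a rotated distance-d
# surface code patch uses d**2 data qubits and d**2 - 1 ancillas.
def physical_qubits(d):
    """Physical qubits for one distance-d rotated surface code patch."""
    return 2 * d * d - 1

for d in (3, 5, 7, 13):
    print(d, physical_qubits(d))
# d = 13 needs 337 physical qubits, the "about 300" quoted above.
```

The quadratic growth is the price of exponential error suppression: every two steps of distance buy a fixed multiplicative improvement while the qubit bill grows only polynomially.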

Here are the concrete next steps the field needs, based on the Google milestone:

  • Distillation of magic states: The surface code only supports Clifford gates natively. Non-Clifford gates require magic state injection and distillation. That consumes many physical qubits. The current demonstration does not address this.
  • Fault-tolerant logical gates: The team only performed error correction on memory and on a single logical qubit. They did not apply a logical gate.
  • Real-time decoding at scale: The FPGA decoder used for this experiment is not fast enough for a large processor. New architectures using analog neuromorphic chips or optical decoders are being explored.

The quantum error correction milestone achieved by Google is not the end of the road. It is the end of the beginning. Every major quantum computing company – IBM, Microsoft, Quantinuum, IonQ – now has a target on its back. Google just set the bar at "exponential error suppression with a 105-qubit chip." Now the race is on.

The sociologist in the room: Why this matters beyond the lab

Every time a quantum error correction milestone is achieved, the cryptography community holds its breath. Running Shor's algorithm against a 2048-bit RSA key is estimated to require roughly 20 million physical qubits with a 10⁻³ physical error rate. The Google result shows that the path from today's 100 qubits to tomorrow's 20 million qubits is at least mathematically plausible. That has real consequences for national security agencies, banks, and blockchain networks. The US National Institute of Standards and Technology (NIST) is already finalizing post-quantum cryptographic standards. The Google paper simply adds urgency to that migration.

But here is the ironic twist: The same error correction techniques that could break encryption could also enable quantum sensors, quantum simulations of new materials, and drug discovery. The quantum error correction milestone achieved by Google is a dual-use technology. It is a tool that can be used to build a quantum crime fighter or a quantum weapon. The choice is not technical. It is political.

Julian Kelly ended his presentation at the APS December meeting by saying, "We are now in the regime where error correction is not a problem. It is a solution. The problem is everything else."

That everything else includes control electronics, cryogenics, qubit connectivity, and the human talent pipeline. Google just proved that the physics works. Now the engineers have to build a factory. And the skeptics get to sharpen their knives one more time.
