Tuesday, December 10th 2024
Google Puts Error Correction First with the Latest "Willow" 105-Qubit Quantum Processor
Google Quantum AI lab has announced a major advancement in quantum computing with its new Willow processor. The chip demonstrated remarkable error-correction capabilities and computational power beyond traditional supercomputers, including ExaFLOP machines such as Frontier and El Capitan. In research published in Nature, Google's team showed that Willow can exponentially reduce error rates as more qubits are added to the system, a feat known as operating "below threshold" that has been an outstanding challenge in quantum computing since error correction was first proposed in 1995. Testing arrays of quantum bits in increasingly larger grids, the team cut the encoded error rate in half with each expansion. The chip's performance is most striking in random circuit sampling (RCS), where Willow completed in under five minutes a calculation that would take today's fastest supercomputer approximately ten septillion (10^25) years, a timespan far greater than the age of the universe.
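For a feel of what "below threshold" means numerically, here is a minimal sketch. The constants LAMBDA and EPSILON_3 are illustrative assumptions, not figures from the Nature paper; the point is only that once the suppression factor exceeds one, the logical error rate shrinks exponentially as the qubit grid grows:

```python
# Illustrative sketch (not Google's data) of "below threshold" scaling.
# In surface-code error correction, the logical error rate per cycle is
# expected to fall roughly as epsilon_d ~ epsilon_3 / Lambda**((d - 3) / 2)
# once physical errors are below threshold. Lambda ~ 2 corresponds to the
# reported halving of the error rate with each grid expansion
# (code distance d = 3 -> 5 -> 7, i.e. roughly 3x3 -> 5x5 -> 7x7 grids).

LAMBDA = 2.0        # assumed suppression factor per distance step (illustrative)
EPSILON_3 = 3e-3    # assumed logical error rate at distance 3 (illustrative)

def logical_error_rate(d: int) -> float:
    """Projected logical error rate per cycle at odd code distance d."""
    return EPSILON_3 / LAMBDA ** ((d - 3) / 2)

for d in (3, 5, 7, 9, 11):
    qubits = d * d  # data qubits in a d x d surface-code patch
    print(f"distance {d:2d} (~{qubits:3d} data qubits): "
          f"logical error ~ {logical_error_rate(d):.2e} per cycle")
```

Running the sketch shows the error rate halving at each step; an above-threshold machine (LAMBDA below one) would instead get worse as it grows, which is why this crossover matters so much.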
Manufactured at Google's specialized facility in Santa Barbara, the 105-qubit Willow chip also achieved impressive coherence times, with qubits maintaining their quantum states for up to 100 microseconds, five times longer than previous generations. Dr. Hartmut Neven, who founded Google Quantum AI in 2012, emphasized that the breakthrough brings quantum computing closer to scaling up into larger, more complex systems. Potential applications include discovering new medicines, designing more efficient batteries for electric vehicles, and accelerating fusion energy research. The next challenge for Google's quantum team is demonstrating a "useful, beyond-classical" computation with practical applications. While Willow has shown superior performance in benchmark tests, researchers are now focused on developing algorithms that can tackle commercially relevant problems impossible for traditional computers to solve.
Source:
Google
Comments
It should be understood as a very specific computational model that does a handful of tasks (fewer than ten: quantum annealing, factorisation (so RSA is in danger, which is why post-quantum crypto exists), simulating a quantum computer (yeah, this one is absolute garbage as a benchmark, but it's what they used to claim "supremacy")) phenomenally well, and everything else extremely poorly.
On these tasks, it crushes everything. That doesn't change the challenges of making it all work, or the limited usefulness in the end.
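To illustrate the RSA point above, here is a minimal, deliberately insecure sketch with toy numbers (the primes, exponent, and message are made up for the example). It shows why an efficient factoring machine breaks RSA: factoring the public modulus is exactly what is needed to recover the private key.

```python
# Toy illustration (insecure, tiny numbers) of why factoring breaks RSA:
# anyone who can factor the public modulus n can recompute the private key.
from math import gcd

# Toy key generation with small primes (real RSA uses ~2048-bit moduli).
p, q = 61, 53
n = p * q               # public modulus
e = 17                  # public exponent, coprime with (p-1)(q-1)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)     # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)             # encrypt with the public key
assert pow(cipher, d, n) == msg     # decrypt with the private key

# The "attack": factor n by trial division, then rebuild the private key.
p_found = next(f for f in range(2, n) if n % f == 0)
q_found = n // p_found
d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
assert pow(cipher, d_cracked, n) == msg
print(f"n={n} factored as {p_found}*{q_found}; private key recovered")
```

Trial division works here only because the modulus is tiny; for real key sizes the factoring step is the wall, and Shor's algorithm is what would (in principle) tear it down.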
"Quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch."
It's an impossible thing to prove or disprove with our current understanding of physics. It's essentially a thought exercise alongside the likes of "what preceded the Big Bang" or "is the actual (not just observable) universe infinite" or "what's beyond the event horizon of a black hole".
On large numbers: well of course, that's where the issue is; limited qubits mean limited-size factorization.
My guess would be: if they were able to factor anything very large, we would have known by now.
Personally, I draw the line for calling something a "quantum computer" at being able to run actual established quantum algorithms on non-trivial inputs. As far as I know, the largest number successfully factored without any tricks is 21.
All quantum research is welcome, but grandiose claims that usually turn out to be full of caveats are only going to fuel the "it's all a scam" crowd.
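For context on the factoring-of-21 remark above, here is a minimal sketch of Shor's classical wrapper with the quantum order-finding subroutine replaced by brute force (the function names and the choice of base are illustrative assumptions). The point is that the quantum hardware's only job is the order-finding step, and that step is what becomes infeasible classically for large moduli:

```python
# Sketch of the classical skeleton of Shor's algorithm, factoring N = 21.
# The only quantum part of Shor's algorithm is order finding; here it is
# replaced by classical brute force, which is exactly why tiny demos like
# "factoring 21" are within reach while large moduli are not.
from math import gcd

def find_order(a, n):
    """Smallest r > 0 with a**r == 1 (mod n) - the quantum subroutine's job."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    g = gcd(a, n)
    if g != 1:
        return g, n // g            # lucky guess already factors n
    r = find_order(a, n)
    if r % 2:                       # odd order: retry with another base
        return None
    y = pow(a, r // 2, n)
    if y == n - 1:                  # trivial square root: retry
        return None
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(21, 2))   # order of 2 mod 21 is 6 -> factors (7, 3)
```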