Tuesday, December 10th 2024

Google Puts Error Correction First with the Latest "Willow" 105-Qubit Quantum Processor

Google's Quantum AI lab has announced a major advance in quantum computing with its new Willow processor. The chip demonstrated remarkable error-correction capabilities and computational power beyond traditional supercomputers, including ExaFLOP machines such as Frontier and El Capitan. In research published in Nature, Google's team showed that Willow can exponentially reduce error rates as more qubits are added to the system, a feat known as operating "below threshold" that has been a central challenge in quantum computing since 1995. Using arrays of quantum bits in increasingly larger grids, the team cut error rates roughly in half with each expansion. The chip's performance is particularly notable in random circuit sampling (RCS), where Willow completed in under five minutes a calculation that would take today's fastest supercomputer approximately ten septillion (10^25) years, a timespan far greater than the age of the universe.
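The "below threshold" behavior described above can be sketched numerically. The model below assumes the logical error rate per cycle drops by a constant factor Lambda each time the surface-code distance grows by two (one grid expansion); the distance-3 starting rate here is illustrative, not a measured value:

```python
# Sketch of "below threshold" error suppression in surface-code QEC.
# Assumption: the logical error rate per cycle is divided by a constant
# factor (lam) with each distance-2 grid expansion; a factor of ~2, as
# the article describes, means errors roughly halve per expansion.

def logical_error_rate(eps_d3, lam, distance):
    """Logical error rate per cycle at an odd code distance,
    extrapolated from the distance-3 rate eps_d3."""
    expansions = (distance - 3) // 2  # number of distance-2 growth steps
    return eps_d3 / lam ** expansions

eps3 = 3e-3  # illustrative distance-3 logical error rate (not measured)
lam = 2.0    # suppression factor per expansion ("cut in half")
for d in (3, 5, 7, 9):
    print(f"distance {d}: {logical_error_rate(eps3, lam, d):.2e}")
```

The key point is the exponent: because suppression compounds with every expansion, the error rate shrinks exponentially in the code distance, which is why operating above or below the threshold makes or breaks scalability.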

Manufactured at Google's specialized facility in Santa Barbara, the 105-qubit Willow chip also achieved impressive coherence times, with qubits maintaining their quantum states for up to 100 microseconds, five times longer than previous generations. Dr. Hartmut Neven, who founded Google Quantum AI in 2012, emphasized that the breakthrough brings quantum computing closer to scaling up for practical, complex workloads. Potential applications include discovering new medicines, designing more efficient batteries for electric vehicles, and accelerating fusion energy research. The next challenge for Google's quantum team is demonstrating a "useful, beyond-classical" computation that addresses practical applications. While Willow has shown superior performance in benchmark tests, researchers are now focused on developing algorithms that can tackle commercially relevant problems that are impossible for traditional computers to solve.
Source: Google

8 Comments on Google Puts Error Correction First with the Latest "Willow" 105-Qubit Quantum Processor

#1
bgx
For those who don't understand quantum computing:

It should be understood as a very specific computational model that does a handful of tasks (fewer than ten: quantum annealing, factorization (so RSA is in danger, which is why post-quantum crypto exists), simulating a quantum computer (yeah, this one is absolute garbage in practice, but that's what they used to claim "supremacy")) phenomenally well, and everything else extremely poorly.

On those tasks, it crushes everything. That doesn't change the challenges of making things work, or the limited usefulness in the end.
Posted on Reply
#2
AleksandarK
News Editor
Not to be paranoid, but Google really confirmed we are in a multiverse.
"Quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch."
Posted on Reply
#3
MacZ
AleksandarK: Not to be paranoid but Google really confirmed we are in a multiverse.
"Quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch."
The multiverse hypothesis is just that, a hypothesis. It is, at best, an interpretation of quantum mechanics, with zero proof and zero data behind it. It is a metaphysical escape from the reality that our universe seems fine-tuned for our kind of life.
Posted on Reply
#4
ncrs
bgx: factorisation (so RSA is in danger - that's why there are post quantic crypto)
So is this implementation able to run Shor's algorithm on non-trivial numbers without any trickery or not?
Posted on Reply
#5
AleksandarK
News Editor
MacZ: The multiverse hypothesis is an hypothesis. It is, at best, an interpretation of quantum mechanics. It has zero proof and zero data. It is a metaphysical escape from the reality that our universe seems fine tuned for our kind of life.
Yup, I understand that very well. However, how can you measure something you can barely harness? I'd love to see more research there. Folks are definitely onto something :)
Posted on Reply
#6
Onasi
@AleksandarK
It's an impossible thing to prove or disprove with our current understanding of physics. It's essentially a thought exercise alongside the likes of "what preceded the Big Bang" or "is the actual (not just observable) universe infinite" or "what's beyond the event horizon of a black hole".
Posted on Reply
#7
bgx
ncrs: So is this implementation able to run Shor's algorithm on non-trivial numbers without any trickery or not?
Able? Probably.

On large numbers: well, of course, that's where the issue is; limited qubits mean limited-size factorization.

My guess would be: if they were able to factor anything very large, we would have known by now.
Posted on Reply
#8
ncrs
bgx: able probably.

on large number: well of course, that's where the issue is, limited qubits means limited number factorization.

My guess would be: if they were able to factor anything very large, we would have known by now.
Which basically brings us to: "no"?
Personally, I draw the line for calling something a "quantum computer" at its ability to run actual established quantum algorithms on non-trivial inputs. As far as I know, the largest number successfully factored without any tricks is 21.
All quantum research is welcome, but grandiose claims that usually turn out to be full of caveats only fuel the "it's all a scam" crowd.
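For context on what factoring 21 involves: Shor's algorithm is mostly classical scaffolding around one quantum step, period finding. The sketch below (function names illustrative) brute-forces the period classically, which is exactly the step a quantum computer would replace with an exponentially faster routine:

```python
from math import gcd

# Classical illustration of Shor's algorithm for N = 21, the largest
# number factored "without tricks" mentioned above. Only find_period()
# would run on quantum hardware; everything else is classical.

def find_period(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force here; this is
    the step a quantum computer speeds up for large n)."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Derive factors of n from the period of a mod n."""
    r = find_period(a, n)
    if r % 2:
        return None  # odd period: retry with a different base a
    f = gcd(a ** (r // 2) - 1, n)
    return (f, n // f) if 1 < f < n else None

print(shor_classical(21, 2))  # period of 2 mod 21 is 6, yielding (7, 3)
```

The brute-force loop takes time exponential in the bit-length of n, which is why this classical version is useless against RSA-sized numbers and why the quantum period-finding step is the whole point.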
Posted on Reply