
Quantinuum's H1 Quantum Computer Successfully Executes a Fully Fault-tolerant Algorithm

AleksandarK

News Editor
Staff member
Fault-tolerant quantum computers promise radical new solutions to some of the world's most pressing problems in medicine, finance and the environment, as well as a truly widespread use of AI, and this promise is driving global interest in quantum technologies. Yet the various timetables that have been set for achieving this paradigm depend on major breakthroughs and innovations, and none is more pressing than the move from merely physical qubits to fault-tolerant ones.

In one of the first meaningful steps along this path, scientists from Quantinuum, the world's largest integrated quantum computing company, along with collaborators, have demonstrated the first fault-tolerant execution of a mathematical procedure, one-bit addition, using three logically-encoded qubits on the Quantinuum H1 quantum computer, Powered by Honeywell.

Fault-tolerant quantum computing methods are expected to open the way for practical solutions to real-world problems across domains such as molecular simulation, artificial intelligence, optimization, and cybersecurity. Following a succession of important breakthroughs in recent years in hardware, software and error correction, today's results, announced by Quantinuum in a new paper on the arXiv, "Fault-Tolerant One-Bit Addition with the Smallest Interesting Colour Code," are a natural step forward and reflect the growing pace of progress.

Many companies and research groups are focused on achieving fault-tolerance by handling the noise that naturally arises when a quantum computer performs its operations. Quantinuum is a proven pioneer, achieving previous firsts such as demonstrating entangling gates between two logical qubits in a fully fault-tolerant manner using real-time error correction, and simulating the hydrogen molecule with two logically-encoded qubits.

By performing one-bit addition using the smallest-known fault-tolerant circuit, the team achieved an error rate almost an order of magnitude lower: ~1.1×10⁻³, compared with ~9.5×10⁻³ for the unencoded circuit. The error suppression observed was made possible by the physical error rates of the quantum charge-coupled device (QCCD) architecture used in Quantinuum's H-Series quantum computers, which are lower than in any other systems known to date. These error rates fall within the range at which fault-tolerant algorithms become feasible.
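As a quick sanity check on the "almost an order of magnitude" claim, the ratio of the two error rates quoted in the article can be computed directly (the two rates are the only inputs; nothing else here comes from the paper):

```python
# Error rates reported in the article: logically-encoded vs unencoded circuit.
encoded = 1.1e-3    # ~1.1×10⁻³ for the fault-tolerant, encoded circuit
unencoded = 9.5e-3  # ~9.5×10⁻³ for the unencoded circuit

factor = unencoded / encoded
print(f"suppression factor: {factor:.1f}x")  # prints: suppression factor: 8.6x
```

A factor of roughly 8.6× is indeed just shy of a full order of magnitude, which matches the article's wording.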

Ilyas Khan, Chief Product Officer and Founder at Quantinuum, said: "In addition to continuing to provide the quantum ecosystem with evidence of what is possible in these early days of quantum computing, the current demonstration is noteworthy for its ingenuity. The ion trap architecture of our H-Series offers the lowest physical error rates and the flexibility derived from qubit transport, which allows users of our hardware to implement a much wider choice of error-correcting codes, and that is what made this possible. Watch out for further important computational advances in the coming period as we link up the quality of our hardware with tasks that are meaningful in the real world."

Low-overhead logical Clifford gates, in combination with the transversal CCZ gate of the three-dimensional colour code, enabled the team to reduce the number of two-qubit gates and measurements required for one-bit addition from over 1,000 to 36.
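For readers unfamiliar with what "one-bit addition" means as a circuit: on computational-basis inputs it is a half adder, and the CCZ gate mentioned above, conjugated by Hadamards on its target, acts as a Toffoli (controlled-controlled-NOT), which supplies the AND needed for the carry bit. The sketch below is a purely classical simulation of that reversible logic, not a reconstruction of the paper's actual fault-tolerant circuit:

```python
# Hypothetical sketch: one-bit addition (a half adder) built from the two
# reversible primitives a quantum circuit would use on basis states.
# Toffoli = H·CCZ·H on the target qubit; here we model its classical action.

def toffoli(a: int, b: int, c: int):
    """Flip target c iff both controls a and b are 1 (computes c ^= a AND b)."""
    return a, b, c ^ (a & b)

def cnot(a: int, b: int):
    """Flip target b iff control a is 1 (computes b ^= a)."""
    return a, b ^ a

def one_bit_add(a: int, b: int):
    """Return (sum, carry) for single-bit inputs a and b."""
    a, b, carry = toffoli(a, b, 0)  # carry = a AND b (the CCZ/Toffoli step)
    a, s = cnot(a, b)               # sum   = a XOR b
    return s, carry

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} + {b} -> sum={one_bit_add(a, b)[0]}, carry={one_bit_add(a, b)[1]}")
```

The point of the paper's construction is that this tiny arithmetic primitive, trivial classically, is the smallest circuit that exercises every ingredient of fault tolerance at once, including the non-Clifford CCZ.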

Ben Criger, Senior Research Scientist at Quantinuum, and principal investigator on the paper, said: "The CCZ gate, which we've demonstrated here, is a key ingredient in Shor's algorithm, quantum Monte Carlo, topological data analysis, and a host of other quantum algorithms. This result proves that real hardware is now capable of running all the essentials of fault-tolerant quantum computing - state preparation, Clifford gates, non-Clifford gates and logical measurement - together."

 
This is a thing! Fully fault-tolerant quantum computing? This is effectively error correction for QCPUs. These folks may have just done something very important.
 
Previously 1% error rate. Now 0.1% error rate. I guess the term should be less faulty. I would be concerned about the type of algorithm used. For some modelling, when looking at averages, expectations or the central part of the modelled distributions, this is excellent news and provides reliable results. For others, when looking at extremes or rare case events, a 0.1% error rate could be massive and will give you garbage output.
 
> Previously 1% error rate. Now 0.1% error rate. I guess the term should be less faulty. I would be concerned about the type of algorithm used. For some modelling, when looking at averages, expectations or the central part of the modelled distributions, this is excellent news and provides reliable results. For others, when looking at extremes or rare case events, a 0.1% error rate could be massive and will give you garbage output.
Follow the wording closely: "These error rates fall within the range at which fault-tolerant algorithms become feasible." Noise causes errors (as in analog signal processing), and less noise causes fewer errors, but apparently faults are not the same as errors, and these researchers lowered the error rate enough to make some qualitative leap. Other than that, I can't comment on these magic analog computers; I know too little about them.
 