Raevenlord
News Editor
Honeywell, a multinational conglomerate with a growing quantum computing division, today announced that it has created the world's most advanced quantum computer. Its new system achieves a quantum volume of 64 - twice the quantum volume of the previous record holder, IBM's Raleigh. You might be looking at that figure and wondering what it means - and where the qubit count went. The thing with quantum computers is that the number of qubits can't really be treated as a definitive measure of performance; instead, it is just one input to the "quantum volume" calculation, which expresses the overall performance of a quantum system.
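For those wondering how the metric itself is computed, quantum volume follows IBM's published definition (which, as far as we can tell, is the same benchmark Honeywell is quoting): it is set by the largest "square" random circuit - equal width and depth - that the machine can run while still returning "heavy" outputs more than two thirds of the time. Roughly:

$$\log_2 V_Q = \max_{m \le n} \min\big(m,\, d(m)\big)$$

where $n$ is the number of qubits in the machine, $m$ is the number of qubits used in a test circuit, and $d(m)$ is the largest circuit depth that can be executed reliably on those $m$ qubits. A quantum volume of 64 is therefore $2^6$: the system can faithfully run random circuits six qubits wide and six layers deep.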
When performing operations at the quantum level, a myriad of factors beyond the absolute number of qubits come into play that adversely impact performance, such as the calculation error rate (i.e., how often the system outputs an erroneous answer to a given problem) and the qubit connectivity level. Qubit connectivity expresses the relationship between a machine's quantum hardware capabilities and the system's ability to distribute workloads across qubits - sometimes workloads can only be distributed to two adjacent qubits; other times they can be distributed to qubits that sit farther apart within the system without losing data coherency and without affecting error rates, increasing both performance and the system's flexibility in processing workloads. If you've seen Alex Garland's Devs series on Hulu (and you should; it's great), you can see a would-be quantum computer and all its intricate connections. Quantum computers really are magnificent crossovers of science, materials engineering, and computing. Of course, the quantum computing arms race means that Honeywell's system will likely be dethroned in quantum volume rather soon.
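To make the error-rate point a little more concrete, here is a deliberately simplified Python sketch of the heavy-output test that quantum volume is built on. It uses no quantum libraries, just NumPy, and every number in it (the crude noise model, the per-qubit error scaling) is an assumption for illustration, not Honeywell's or IBM's actual methodology: as the effective error rate grows with circuit size, the heavy-output probability sinks toward 0.5 and larger circuits stop counting toward the machine's quantum volume.

```python
import numpy as np

# Toy illustration (not a real quantum simulation): how a device's error rate
# drags the heavy-output probability of a random circuit toward 0.5, which is
# what ultimately caps the achievable quantum volume.

rng = np.random.default_rng(42)

def heavy_output_probability(num_qubits: int, error_rate: float) -> float:
    """Estimate the chance that a noisy device returns a 'heavy' output.

    Ideal random circuits produce an exponentially distributed (Porter-Thomas)
    output distribution; 'heavy' outputs are those above the median ideal
    probability. Noise is modeled crudely by mixing the ideal distribution
    with the uniform distribution.
    """
    dim = 2 ** num_qubits
    ideal = rng.exponential(scale=1.0, size=dim)
    ideal /= ideal.sum()                       # ideal output distribution
    heavy = ideal > np.median(ideal)           # 'heavy' bitstrings
    noisy = (1 - error_rate) * ideal + error_rate / dim  # depolarizing-style mix
    return float(noisy[heavy].sum())

for qubits in range(2, 9):
    # Assume, purely for illustration, that effective error grows with circuit size.
    p_heavy = heavy_output_probability(qubits, error_rate=0.12 * qubits)
    passed = p_heavy > 2.0 / 3.0               # the quantum-volume pass threshold
    print(f"{qubits} qubits: heavy-output prob ~ {p_heavy:.3f} -> "
          f"{'pass' if passed else 'fail'}")
```

In a real benchmark the noisy distribution comes from running random circuits on the hardware itself rather than from a formula, but the pass/fail structure is the same: find the largest width m that still clears the two-thirds threshold, and the quantum volume is 2^m.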
View at TechPowerUp Main Site