Microsoft and Quantinuum have demonstrated logical qubits with error rates up to 800 times lower than those of the underlying physical qubits, a result they claim heralds a new era of quantum computing.
Microsoft, in strategic partnership with quantum-computing company Quantinuum, announced a breakthrough in quantum error correction that it says pushes the field into a new era of quantum computing.
Krysta Svore, vice president of advanced quantum development at Microsoft, says that today’s quantum computers sit at the base level of the technology: a stage termed “noisy intermediate-scale quantum,” or NISQ. The highlight of the recent announcement is that Microsoft and Quantinuum have reached the second stage: resilient quantum computing.
As a result, Svore predicts that commercially viable quantum computing, which requires at least 1,000 reliable logical qubits, will arrive within years rather than decades. Quantum computing useful for scientific purposes, which needs only about 100 logical qubits, is even closer.
Two factors stand as the primary obstacles to functional quantum computing: how many qubits a single quantum computer has, and how reliable those qubits are. The challenge parallels traditional computing, where manufacturers continually pack more gates onto a single chip while keeping them operating reliably.
The two challenges are tied together. Many companies developing quantum computers address the reliability problem by dedicating multiple physical qubits to each calculation, combining many error-prone physical qubits – possibly a thousand or more – into a single dependable logical qubit. Because each additional physical qubit is extremely difficult to build, this overhead substantially impedes progress.
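To see why redundancy helps, consider the simplest classical analogy: a repetition code, in which one bit is stored in several unreliable copies and read out by majority vote. This is a deliberately simplified sketch, not the scheme any of these vendors actually uses (real quantum codes must also handle phase errors and cannot simply copy qubits), but it shows how combining unreliable components yields a more reliable logical unit:

```python
import random

def logical_error_rate(p, n, trials=100_000):
    """Estimate the logical error rate of an n-copy repetition code
    under independent bit-flip noise with physical error rate p.
    The encoded bit is wrong only when a majority of copies flip."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(n))
        if flips > n // 2:
            failures += 1
    return failures / trials

# With a 1% physical error rate, three redundant copies already push
# the logical error rate down to roughly 0.03% (analytically,
# 3*p^2*(1-p) + p^3 with p = 0.01).
p_physical = 0.01
print(logical_error_rate(p_physical, 1))  # close to 0.01
print(logical_error_rate(p_physical, 3))  # roughly 0.0003
```

Adding more copies suppresses the logical error rate further, which is why, for qubits with much higher raw error rates, estimates of the physical-to-logical overhead have run to a thousand or more.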
In addressing this problem, quantum computing firms tend to work on both hardware and software. On the hardware side, efforts are made to minimize errors from the outset by controlling temperature fluctuations and vibrations, or by building more stable qubits in the first place.
Another hardware-centric strategy is to build in physical error-correction capabilities. Earlier this year, three vendors announced significant progress in this area. One innovative example uses backup photons instead of backup qubits: microwave-range photons bouncing inside hollow mirrored spheres or circuits, each quantum-linked to the qubit being backed up.
Quantinuum, Microsoft’s partner, claims that its computers have lower physical error rates than competitors’ and feature the smallest known fault-tolerant circuit. Last September, the company announced a breakthrough in executing mathematics on a fault-tolerant system using three logical qubits.
In a separate announcement on March 5, the company said it had solved the wiring problem, a significant issue in quantum computing: each qubit requires multiple control signals, which makes adding more qubits to a system increasingly difficult. Quantinuum says it has reduced the wiring to one digital input per qubit, plus a constant number of analog signals, finally putting scalability within reach.
According to Svore, Microsoft’s breakthrough depended on this hardware advantage. The new reliability record came not only from innovative error-correction algorithms but also from the “collaboration between the hardware and software implementations,” she says, “and by implementing physical qubits that provide high fidelities and connectivity.”
This hardware-software collaboration both reduced the number of physical qubits needed to form one logical qubit and yielded logical error rates up to 800 times lower than the underlying physical error rates.
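As a rough back-of-the-envelope illustration of what an 800-fold error-rate improvement means in practice (the physical error rate below is a hypothetical figure chosen only for arithmetic convenience, not a number from the announcement):

```python
# Hypothetical figures for illustration only.
p_physical = 8e-3              # assumed physical error rate per operation
p_logical = p_physical / 800   # an 800x improvement -> 1e-5

# Expected number of operations before the first error, on average.
print(round(1 / p_physical))   # 125 operations at the physical rate
print(round(1 / p_logical))    # 100000 operations at the logical rate
```

The point of the milestone is this multiplier: an algorithm that would fail almost immediately on raw physical qubits can run long enough on logical qubits to complete useful work.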
They refer to their software error-correction method as the “Carbon code.”
These companies are not the first to implement software-based error correction on quantum computers. Other firms, including IBM and Alice & Bob, use low-density parity-check (LDPC) codes, a technique common since the 1990s, primarily in communications.
According to Svore, Carbon code is unique. “We do not consider the Carbon code to be an LDPC code,” she remarks.
In technical terms, the Carbon code is a stabilizer code of the Calderbank-Shor-Steane (CSS) type, a category of quantum error-correcting code. “It encodes two logical qubits among 12 physical qubits,” Svore explains.
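The announcement does not spell out the Carbon code’s internals, but CSS codes in general are built from classical linear codes whose parity checks locate bit-flip and phase-flip errors separately. As an illustration only – using the classical [7,4] Hamming code that underlies the well-known Steane CSS code, not the Carbon code itself – a measured syndrome pinpoints an error without ever reading out the encoded data:

```python
import numpy as np

# Parity-check matrix of the classical [7,4] Hamming code. In the Steane
# CSS construction, these same checks are applied twice: as Z-type
# stabilizers to catch bit-flip errors and as X-type stabilizers to
# catch phase-flip errors.
H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def syndrome(error):
    """Return the 3-bit syndrome of a length-7 error pattern (mod 2)."""
    return (H @ error) % 2

# A single bit flip on position 5 produces the syndrome (1, 0, 1) --
# the binary representation of 5 -- so the error can be located and
# undone without disturbing the encoded information.
e = np.zeros(7, dtype=int)
e[4] = 1  # flip position 5 (1-indexed)
print(syndrome(e))  # [1 0 1]
```

An error-free block yields the all-zero syndrome; any single bit flip yields a nonzero syndrome that names the flipped position, which is the basic mechanism a full error-correction cycle measures and acts on.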
In addition, the announcement demonstrates a full error-correction cycle, not just parts of the cycle, as was the case in previous experiments, says Sam Lucero, chief quantum analyst at Omdia. “Fault-tolerant quantum computers are not just a theoretical possibility but have a strong chance to be realized in the real world,” he says.
Some quantum-computing experts doubt that the breakthrough is as big as Microsoft and Quantinuum are painting it to be. “Microsoft itself says it needs to improve the fidelity by at least three orders of magnitude,” says Omdia’s Lucero.
And the experiment showed only Clifford gates, he says. Clifford gates support only some types of computation – circuits built from them alone can even be simulated efficiently on classical computers – so the logical qubits Microsoft demonstrated aren’t enough for a full universal quantum computer, he says. “Non-Clifford gate functionality will have to be added at some point.”
And four logical qubits is a long way from the 100 needed for scientific value, he adds.
On a positive note, current encryption techniques remain secure for now. Lucero says that about 2,000 logical qubits may be required to run Shor’s algorithm against the public-key encryption schemes, such as RSA, that it threatens.
Microsoft may have selectively reported results to create a sensational headline, according to David Shaw, chief analyst at Global Quantum Intelligence. Unsuccessful attempts were left out, he says, so careful reading is required to see how much the error rate actually improved. The improvement was remarkable, but it took a discerning look to find it.
In Shaw’s analogy, quantum computing is at the stage of igniting a rocket engine, not yet a Sputnik moment: more of a static burn test than an actual launch into orbit.
It is not yet clear how Microsoft’s methodology could be scaled up to further reduce errors, or how it could be extended to universal quantum gates.
“Yes, it is a good milestone,” Shaw says. “The debate that’s there in the field is how soon can we build large-scale fault-tolerant systems. This doesn’t really change the debate. A four-logical-qubit machine would have scientific interest, and maybe niche applications, but they’re not likely to be general-purpose, widely usable applications.”
Other companies have built quantum computers with more qubits. But quantum computing today is so nascent that there are multiple and radically different approaches to building the physical qubits.
Microsoft’s approach is heavily dependent on Quantinuum’s quantum-computing hardware, says Baptiste Royer, a professor at the University of Sherbrooke, so other companies are unlikely to jump on this same technology immediately. “But they might be inspired by the theory behind this,” he says.
According to Royer, the latest announcement is the result of a series of small improvements – in addition to the error-correction codes, there were also improvements in the hardware, in calibration, in fabrication, in precision, as well as new measurement protocols.
Consequently, there are no major, immediate benefits for businesses seeking practical quantum computing, particularly given the small number of logical qubits.
“This provides a playground for researchers working on error correction,” says Royer. “For these researchers, it’s invigorating. For the general public, I don’t foresee any immediate practical effects – however, it is likely to bring quantum computing nearer and decrease the time needed to reach practical applications.”