Closer to a dream than a reality only a few years ago, the quantum computer is now becoming increasingly concrete. Its construction, however, still faces many challenges, starting with the fragility of the information encoded in a quantum computer, which introduces all sorts of errors into calculations.
While classical computers manipulate conventional binary values of 0 or 1 to perform calculations, quantum computers use so-called "qubits" (or "quantum bits"), which can be placed in superposition and entangled with one another, allowing them to interact much more richly than classical bits. These properties lead in principle to exponential speedups for certain computational tasks, but the unreliability of these qubits remains a major obstacle today.
Qubits are very noisy: as an order of magnitude, roughly one operation in 1,000 goes wrong. Given that some quantum computations require 100 billion operations, at this error rate the final result is virtually guaranteed to be wrong.
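A back-of-the-envelope calculation makes the problem concrete (the 1-in-1,000 error rate and the 100-billion-operation count come from the paragraph above; the rest is a rough sketch):

```python
# Rough sketch: with error probability p per operation and N operations,
# the expected number of errors is N * p, and the probability that every
# single operation succeeds is (1 - p)^N.

import math

p = 1e-3     # roughly one faulty operation in 1,000
N = 100e9    # 100 billion operations

expected_errors = N * p                      # about 100 million errors
# (1 - p)^N underflows to 0.0 in floating point, so work with its log:
log10_success = N * math.log10(1 - p)        # around -43 million

print(f"expected errors: {expected_errors:.0f}")
print(f"P(no error at all) ~ 10^({log10_success:.3g})")
```

In other words, the chance of an error-free run is about one in 10^43,000,000, which is why error correction is unavoidable rather than optional.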
Protecting the quantum computer from noise to reduce computing time
In the late 1990s, researchers began to think about how to reduce errors in quantum computations. Among the solutions found: error-correcting codes, which correct errors by exploiting redundancy. "The error-correcting code spreads the information over a large number of qubits, in a redundant way, so that if there is a little 'noise', we can correct the errors and thus protect the information useful to the computation," Anthony Leverrier details, before adding: "This is something that does not really exist in classical computing, because transistors are very reliable, but it is necessary in quantum computing."
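The redundancy idea can be illustrated with its simplest classical analogue, the 3-bit repetition code (this is only a sketch of the intuition: actual quantum codes are subtler, since quantum states cannot simply be copied):

```python
# Minimal classical analogue of protecting information by redundancy
# (not an actual quantum code): each logical bit is spread over three
# physical bits, and a majority vote corrects any single bit flip.

import random

def encode(bit):
    """Spread one logical bit redundantly over three physical bits."""
    return [bit, bit, bit]

def add_noise(codeword, p):
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    """Majority vote: recovers the logical bit despite one flipped bit."""
    return int(sum(codeword) >= 2)

# Any single error is corrected:
assert decode([1, 0, 1]) == 1
assert decode([0, 0, 1]) == 0
```

The price of this protection is overhead: three physical bits per logical bit here, and far more for codes that must withstand many simultaneous errors.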
Building on this, the challenge for researchers was to find the best quantum error-correcting codes to implement: codes that minimize the number of redundant qubits needed to protect the information. For example, Shor's quantum factoring algorithm (which can be used to break encrypted communications) requires only a few thousand qubits in theory, but this number rises to several tens of millions once the redundant qubits demanded by known quantum error-correcting codes are included. This overhead is a fundamental problem in the development of the quantum computer.
The quality of an error-correcting code is measured by its distance, which determines the number of errors it can correct: a code of distance d corrects up to ⌊(d−1)/2⌋ arbitrary errors. After more than twenty years without progress on this question, the first codes with distance better than the square root of the code length were discovered in 2020, and it was only very recently, in November 2021, that the Russian researchers Pavel Panteleev and Gleb Kalachev managed to show that, at least in theory, quantum information can be protected from errors as well as classical information can. This breakthrough was rewarded with the Best Paper Award at the 54th Symposium on Theory of Computing (STOC 2022).
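To get a feel for what beating the square-root barrier buys, compare the two scalings at a fixed code length (the square-root threshold comes from the text; the constant 1/10 below is purely illustrative, not a property of any specific code):

```python
# Hedged sketch: a code of distance d corrects up to t = (d - 1) // 2
# arbitrary errors. Compare distance ~ sqrt(n) (the pre-2020 state of
# the art) with distance growing linearly in n ("good" codes).

import math

def correctable_errors(d):
    return (d - 1) // 2

n = 1_000_000                    # code length (number of physical qubits)
d_sqrt = math.isqrt(n)           # old scaling: distance ~ sqrt(n) = 1000
d_linear = n // 10               # good codes: distance = Theta(n); 1/10 is illustrative

print(correctable_errors(d_sqrt))    # 499 correctable errors
print(correctable_errors(d_linear))  # 49999 correctable errors
```

At a million qubits, linear distance tolerates errors on a constant fraction of the qubits, rather than on a vanishing fraction.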
That discovery was followed last March by the publication, by the French researchers Anthony Leverrier (Inria, COSMIQ project-team) and Gilles Zémor (Institut de Mathématiques de Bordeaux), of a different (but related) construction of good quantum LDPC codes: quantum Tanner codes. This family of codes, conceptually much simpler to analyze than Panteleev and Kalachev's, has been the main new tool leading to the resolution of the NLTS conjecture in quantum complexity theory. The result has attracted considerable interest from the community and will be presented on November 3rd at FOCS 2022.
The 63rd IEEE Annual Symposium on Foundations of Computer Science (FOCS 2022), sponsored by the IEEE Computer Society Technical Committee on Mathematical Foundations of Computing, will be held in Denver, Colorado, from October 31 to November 3, 2022.
Facilitating the transition to large-scale quantum computing
But good quantum codes alone are not enough to build a quantum computer: one must also be able to correct errors efficiently, in real time. Last June, three papers showed that these recent quantum LDPC codes can be decoded in linear time, including one authored by Anthony Leverrier and Gilles Zémor, which will be presented in January at SODA 2023.
While this is already remarkable, linear-time decoding is probably still far too slow to be useful for a large-scale quantum computer, since errors keep accumulating during the decoding process.
The two French researchers addressed this problem in a paper published last August, presenting a parallel decoder that corrects arbitrary errors of linear weight in logarithmic time. "This was an open question in the Russian researchers' paper, and with Gilles we have shown how to decode these new codes quickly," concludes Anthony Leverrier.
These results place France among the most advanced nations in the field of quantum computing. The French government is closely monitoring and supporting this field. In January 2021, France adopted a "national quantum strategy". The PEPR Quantum Technologies program, with a budget of 150 million euros and led by the CEA, the CNRS and Inria, is the upstream research component of this strategy. Anthony Leverrier is working on one of the ten PEPR projects, NISQ2LSQ, which aims to improve and develop new error correction strategies needed to implement fault-tolerant quantum computers.