Google Quantum AI group's breakthrough to reduce decoherence

The “entanglement” phenomenon and the decoherence problem

A quantum computer is a computer that exploits quantum mechanical phenomena. Physical matter exhibits properties of both particles and waves at small scales, and quantum computing leverages this behaviour using specialised hardware.

Classical physics cannot explain the operation of these quantum devices. The design of quantum algorithms involves creating procedures that allow a quantum computer to perform calculations efficiently.

[Image: Google quantum computer. Credit: Nature]

A scalable quantum computer could perform some calculations exponentially faster than any modern classical computer. The current state of quantum computing is still largely experimental and impractical.

The basic unit of information in quantum computing is the qubit, similar to the bit in traditional digital electronics. Unlike a classical bit, a qubit can exist in a superposition of its two basis states; loosely speaking, it exists in both states simultaneously until measured.
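
To make superposition concrete, here is a minimal sketch using Cirq, Google's open-source quantum programming library (an illustrative choice; the article itself does not reference any code). A Hadamard gate puts one qubit into an equal superposition, and repeated measurement returns 0 about half the time and 1 the other half:

```python
import cirq

# Put a single qubit into an equal superposition with a Hadamard gate,
# then measure it 1000 times.
q = cirq.LineQubit(0)
circuit = cirq.Circuit(cirq.H(q), cirq.measure(q, key='m'))
result = cirq.Simulator().run(circuit, repetitions=1000)
# Roughly half the shots come out 0 and half come out 1,
# e.g. Counter({0: 497, 1: 503}).
print(result.histogram(key='m'))
```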

The number of errors, a long-standing barrier to the much-hyped technology, has been lowered, according to recent research that Google scientists said on February 22, 2023 marked a significant advance in their effort to create functional quantum computing.

Quantum computing, which has been hailed as a revolutionary development, promises machines with abilities far beyond those of the current generation of conventional computers.

The technology is still primarily theoretical, though, and many challenging issues remain to be solved, such as persistently high error rates. The Google Quantum AI group recently published new research in the journal Nature revealing a method that can dramatically reduce those error rates.

This would offer the American tech firm an advantage over competitors. Whereas conventional computers utilise bits, which can only represent 0 or 1, quantum computers employ qubits, which can represent both 0 and 1 simultaneously.

A quantum computer may concurrently evaluate an immense number of possible outcomes because of the superposition property. These computers use some of the most astounding features of quantum physics, such as the “entanglement” phenomenon, which allows two qubits of a pair to remain in a shared state even when they are separated by great distances.
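
The sketch below, again in Cirq (an illustrative choice), prepares the canonical entangled “Bell pair”: after a Hadamard and a CNOT gate, the two qubits always give identical measurement results, whichever outcome occurs:

```python
import cirq

# Prepare a Bell pair: H on the first qubit, then CNOT onto the second.
a, b = cirq.LineQubit.range(2)
circuit = cirq.Circuit(cirq.H(a), cirq.CNOT(a, b), cirq.measure(a, b, key='m'))
result = cirq.Simulator().run(circuit, repetitions=1000)
# Only the correlated outcomes 00 (histogram key 0) and 11 (key 3)
# ever appear; 01 and 10 never do.
print(result.histogram(key='m'))
```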

Yet when qubits come into contact with the outside world and leave their quantum state, a condition known as decoherence, they can lose their information.

High error rates brought on by this fragility, which also rise with the number of qubits, frustrate researchers who wish to scale up their experiments.

Google Quantum AI group’s system utilising an error-correcting code

Physically engineering high-quality qubits has proven challenging. If a physical qubit is not sufficiently isolated from its environment, it suffers from quantum decoherence, which introduces noise into calculations.
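
Decoherence can be modelled in simulation as a noise channel. The toy sketch below (a simplified model, not a simulation of any real device) adds a depolarizing channel to the Bell-pair circuit from above; the outcomes 01 and 10, which an ideal entangled pair never produces, start to appear:

```python
import cirq

# The ideal Bell-pair circuit from before.
a, b = cirq.LineQubit.range(2)
ideal = cirq.Circuit(cirq.H(a), cirq.CNOT(a, b), cirq.measure(a, b, key='m'))

# Apply a 5% depolarizing channel after every moment to mimic decoherence.
noisy = ideal.with_noise(cirq.depolarize(p=0.05))
result = cirq.DensityMatrixSimulator().run(noisy, repetitions=1000)
# Keys 1 (01) and 2 (10) now show up alongside 0 (00) and 3 (11).
print(result.histogram(key='m'))
```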

Two of the most promising technologies are superconductors, which isolate an electrical current by eliminating electrical resistance, and ion traps, which confine a single atomic particle using electromagnetic fields.

A system utilising an error-correcting code, however, can detect and rectify errors without altering the underlying information; Google’s team claimed to be the first to demonstrate this in practice.
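
Google’s experiment used the surface code, which is too involved for a short example. As a stand-in, the toy sketch below uses the much simpler three-qubit repetition code to show the core idea: copy one logical bit across several physical qubits, then use a majority vote to undo a single error. (Real error correction extracts parity information via ancilla qubits instead of measuring the data directly, as this simplified version does.)

```python
import cirq

data = cirq.LineQubit.range(3)
circuit = cirq.Circuit(
    cirq.CNOT(data[0], data[1]),  # copy the logical bit onto qubit 1
    cirq.CNOT(data[0], data[2]),  # ...and onto qubit 2
    cirq.X(data[1]),              # inject a deliberate bit-flip error
    cirq.measure(*data, key='m'),
)
result = cirq.Simulator().run(circuit, repetitions=100)
# Every shot reads (0, 1, 0); a majority vote still recovers logical 0.
for bits, count in result.histogram(key='m', fold_func=tuple).items():
    print(bits, '-> logical', int(sum(bits) > 1), f'({count} shots)')
```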

https://quantumai.google/qecmilestone

Although the method was first proposed in the 1990s, prior attempts had only resulted in more errors, not fewer, according to Google’s Hartmut Neven, a co-author of the research.

Neven stated at a press conference that if all system components have sufficiently low error rates, “then the magic of quantum error correction comes in.”
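
A rough way to see why component error rates matter so much is the threshold behaviour of error-correcting codes. The sketch below uses an assumed textbook scaling form, not Google’s measured numbers: below a threshold physical error rate, growing the code distance d suppresses the logical error exponentially, while above it, adding more qubits makes things worse:

```python
# Toy threshold model: logical error ~ A * (p / p_th) ** ((d + 1) / 2).
# The prefactor and threshold value here are illustrative assumptions.
def logical_error(p_physical, p_threshold=0.01, d=3, prefactor=0.1):
    return prefactor * (p_physical / p_threshold) ** ((d + 1) / 2)

for d in (3, 5, 7):
    below = logical_error(0.001, d=d)  # physical rate well below threshold
    above = logical_error(0.02, d=d)   # physical rate above threshold
    print(f"d={d}: below {below:.2e}, above {above:.2e}")
```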

The breakthrough was praised as “a critical scientific milestone” by research co-author Julian Kelly, who added that “quantum error correction is the single most crucial technique for the future of quantum computing.”

The result is still not good enough, however: error rates need to fall much further. More stages remain before the goal of a practical quantum computer is realised.

Google claimed in 2019 that it had achieved a milestone known as “quantum supremacy” when its Sycamore machine completed a computation in 200 seconds that would have taken a classical supercomputer 10,000 years to finish.

Google’s quantum computing chip, dubbed Sycamore, achieved its results using exactly 53 qubits; a 54th one on the chip failed. Sycamore’s task was to randomly produce strings of 1s and 0s, one digit for each qubit, drawn from 2^53 possible bit strings.
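
For a sense of the task, here is a scaled-down sketch of random circuit sampling on just 5 qubits (the circuit structure and gate choices are illustrative assumptions; simulating 53 qubits at Sycamore’s depth is what overwhelms classical machines):

```python
import random
import cirq

qubits = cirq.LineQubit.range(5)
circuit = cirq.Circuit()
for layer in range(4):
    # Random single-qubit rotations on every qubit...
    circuit.append(cirq.ry(random.uniform(0, 3.14))(q) for q in qubits)
    # ...then entangle alternating neighbouring pairs.
    circuit.append(cirq.CZ(qubits[i], qubits[i + 1]) for i in range(layer % 2, 4, 2))
circuit.append(cirq.measure(*qubits, key='m'))

# Each repetition samples one of the 2**5 = 32 possible bit strings;
# Sycamore did the same over 2**53 possibilities.
result = cirq.Simulator().run(circuit, repetitions=20)
print(result.histogram(key='m'))
```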

The claim has subsequently been contested, with Chinese researchers arguing in 2021 that a classical supercomputer could have matched Sycamore’s performance.

Source: Wikipedia, Google Quantum AI, Scientific American