Qubit Contenders

Benjamin Skuse

Quantum computers have been ‘about a decade away’ from solving real-world practical problems better than modern supercomputers for, well, a lot longer than a decade now. Their promise famously lies in the qubit, the quantum version of the traditional bit. Where a common bit can represent either a 0 or a 1, a qubit takes advantage of the quantum phenomenon of superposition to be able to represent a 0, a 1 or a state where it is any proportion of both 0 and 1 simultaneously – opening the door to a whole new world of possibilities.

When you have more than one qubit, other quantum mechanical phenomena, particularly interference and entanglement, come into play. A single qubit \(\psi\) can be described by \(\alpha|0\rangle + \beta|1\rangle\), where \(\alpha\) and \(\beta\) are probability amplitudes whose squared magnitudes give the probabilities of measuring a 0 or a 1 (so \(|\alpha|^2 + |\beta|^2 = 1\)). Constructive and destructive interference can be used to amplify or cancel out probability amplitudes. And entanglement can correlate qubits with each other to form a single system, so that measuring the state of one qubit tells you the state of the other without measuring it.
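Because these amplitudes are just complex numbers, interference can be illustrated with a tiny classical simulation (a sketch only: here the state vector is tracked directly, which is exactly what becomes infeasible classically at scale). Applying the Hadamard gate twice shows the \(|1\rangle\) amplitude cancelling destructively:

```python
import math

s = 1 / math.sqrt(2)

def hadamard(alpha, beta):
    # The Hadamard gate maps |0> -> (|0> + |1>)/sqrt(2)
    # and |1> -> (|0> - |1>)/sqrt(2).
    return s * (alpha + beta), s * (alpha - beta)

state = (1.0, 0.0)          # start in |0>
state = hadamard(*state)    # equal superposition: P(0) = P(1) = 0.5
state = hadamard(*state)    # the two paths to |1> interfere destructively
p0, p1 = abs(state[0]) ** 2, abs(state[1]) ** 2
print(round(p0, 6), round(p1, 6))  # back to |0> with certainty: 1.0 0.0
```

The second Hadamard sends the two amplitudes for \(|1\rangle\) along paths with opposite signs, so they cancel exactly, which is the mechanism quantum algorithms exploit to make wrong answers vanish.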

Some quantum algorithms have already been designed to take advantage of these quantum phenomena, where entanglement and interference are utilized on different sets of qubits to perform a computation, after which a measurement is made that collapses each superposition down to a definite 1 or 0, giving an answer.
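That final measurement step can be mimicked classically (hypothetical amplitudes, with the collapse replaced by weighted random sampling): each run of the circuit yields one definite bit, and only the outcome frequencies over many runs reveal the underlying amplitudes.

```python
import math
import random

random.seed(1)  # for reproducibility
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # an equal superposition
probs = [abs(alpha) ** 2, abs(beta) ** 2]

# Each 'shot' collapses the superposition to a definite 0 or 1;
# over many shots the outcome frequencies follow the squared amplitudes.
shots = [random.choices([0, 1], weights=probs)[0] for _ in range(10_000)]
frac_ones = sum(shots) / len(shots)
print(frac_ones)  # close to 0.5 for an equal superposition
```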

Finicky Qubits

But there is just one small problem – by its very nature, a qubit is extremely finicky. You need to build it from a physical system with two distinguishable configurations that correspond to the computational basis states, \(|0\rangle\) and \(|1\rangle\), and the system must exhibit quantum properties. Once you have managed to fashion one qubit with these qualities, you then need to build many, many more (ideally identical) qubits to provide the kind of processing capacity that can be useful.

On top of that, they must easily interact with one another to allow complex computations, while at the same time remaining robust. Frustratingly though, almost all proposed qubits so far are extremely fragile. When they interact with their environment or strongly interact with one another, they decohere, collapsing into a single reality and transforming into mundane classical bits.

To get anywhere close to reaching the full potential of quantum computers, solutions must be found for engineering large numbers of high-quality, well-connected qubits into a system that can execute a quantum algorithm accurately and then measure the final state of the qubits to read out the desired result.

A quantum supercomputer consisting of a million such qubits that can perform one quintillion operations should do it, according to Microsoft. With such a machine, scientific and commercial applications are limited only by the imagination. For example, drug discovery would take days, not years, with the quantum computer being able to quickly simulate complex molecules to identify promising candidates. A quantum supercomputer would also be able to design materials from the atomic level up, imbuing them with extraordinary properties for various purposes, from energy to construction and transportation. It would even be capable of simulating the physics of strongly interacting quantum systems towards building a deeper understanding of how our universe fundamentally works.

The Frontrunner

Google CEO Sundar Pichai with one of Google’s quantum computers in the Santa Barbara lab. Credit: Google AI Quantum.

So is there a qubit out there that Goldilocks would judge to be ‘just right’ for a future quantum supercomputer? One candidate tech behemoths such as IBM and Google are putting their considerable weight behind is the superconducting qubit. Superconducting qubits are very much in IBM and Google’s wheelhouse. Essentially, they are tiny, simple circuits that can be fabricated using processes similar to ones technology companies already use. But these circuits are a little different; they are quantized, meaning that they follow the rules of quantum mechanics and only take on discrete states, and they operate near absolute zero (i.e. close to −273.15 degrees Celsius) so that their constituent metals become superconducting, conducting current without resistance. The extreme cold eliminates resistive losses and suppresses the thermal noise that would otherwise cause decoherence.

Quantum computers built from superconducting qubits have been at the forefront of the quantum computing revolution for some time. For example, released in 2019, IBM Quantum System One was the first circuit-based commercial quantum computer. Also in 2019, Google’s 53-qubit Sycamore quantum processor grabbed headlines for claiming the experimental realization of ‘quantum supremacy’ on a specific task. In their own words: “Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times – our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years.”

IBM and others have since gone on to increase their qubit numbers, with 2022’s IBM Osprey quantum processor featuring 433 superconducting qubits, and more recently their Condor chip housing 1121 qubits. However, further development of quantum computers based on superconducting qubits faces a number of challenges.

Already, innovative cryogenic methods have been required to overcome mechanical issues associated with cooling the Condor and other chips based on superconducting qubits, and further innovations will be needed in next-generation systems. Moreover, superconducting qubits are sensitive to environmental noise and exhibit high error rates, problems that will only become worse as qubit numbers scale.

Bright minds are now focusing on quality rather than quantity of qubits, and are making progress in advancing quantum error correction and error mitigation, improving processor architectures to significantly reduce errors, and developing new cooling methods. But others are looking elsewhere, instead researching whether different types of qubits could outshine superconducting qubits in the long term.

An array of Sycamore chips being prepared for preliminary electrical testing. Credit: Google AI Quantum.

Leading the Charge

Trapped-ion qubits are a leading contender. An ion is an atom or molecule that has lost or gained one or more electrons, giving it a net electric charge. It can then be suspended in free space, or ‘trapped’, using electromagnetic fields. Several ions can be trapped, laser-cooled and placed close together in a one-dimensional array, called an ion chain. This configuration allows the excitations of the ions and the motion of the ion chain to be carefully manipulated with lasers. In particular, the laser light can be wielded to transition an electron in the atom from its ground to an excited state, an ideal pair of qubit basis states. Quantum operations can then be performed using laser or microwave pulses, which can manipulate the internal states of the ions and the interactions between them, including nudging the ions into a state of entanglement.

As trapped-ion qubits are based on ionized atoms, each qubit is identical. And, once prepared in a particular stable quantum state, they remain in that state for very long periods of time (coherence times measured in seconds to minutes as opposed to microseconds for superconducting qubits). Another advantage is that any qubit in the system can be directly entangled with any other qubit. What is more, they exhibit comparatively low error rates and are highly controllable.

For these reasons, research groups across the world take trapped-ion systems as their platform of choice for experimenting with and developing quantum computing technology and conducting quantum simulations. And they are not alone. Companies like IonQ and Quantinuum have already made commercial quantum computers using trapped ions.

Yet these systems seem to have hit a barrier in terms of the sheer number of qubits they can trap and utilize. IonQ’s largest system, Forte, boasts 36 physical qubits and, just this year, Quantinuum announced the launch of its 56-qubit System Model H2. Scaling to 100 or more qubits may prove to be a bridge too far. This is because when the ion chain contains many ions, coherence time and readout accuracy suffer. Longer ion chains have shorter coherence times, with chains of 100+ ions suffering significant decoherence. At the same time, it becomes harder to read out individual ions accurately as chains grow longer, because doing so demands laser-pulse precision beyond current technological capabilities.

IonQ optical system. Credit: IonQ.
Ion trap encased in chamber. Credit: Quantinuum.

Light Alternative

If these hurdles cannot be overcome, photonic qubits are also a serious option. The primary advantage of using the photon as a qubit is that photons are naturally resilient to the types of noise that cause errors and decoherence in other platforms, because they interact much less with their environment. This means that, where superconducting qubits and trapped-ion qubits require complicated cooling setups, quantum computers that depend on photonic qubits can, in principle, operate at room temperature.

They are also very flexible in terms of how they encode quantum information. For example, photonic qubits can be generated using quantum dots in nanophotonic waveguide circuits, or by utilizing the polarization of photons or their paths of travel. Moreover, advances in the fabrication of integrated optical components mean such quantum computers could fit on a single chip, and also readily integrate into existing fibre-optic based telecommunications systems, paving the way for secure quantum communication.

There is just one major problem: Photons not only interact very little with their environment, they also do not interact much with each other. This means they are easily lost and difficult to control, making it challenging to perform complex quantum operations.

Among others, tackling this challenge head-on are the University of Science and Technology of China (USTC) and the Canadian company Xanadu. USTC’s Jiuzhang 3.0 and Xanadu’s Borealis can both solve Gaussian boson sampling problems – a sampling task naturally suited to photonic hardware – in microseconds, as opposed to the many thousands of years a traditional supercomputer would require to accomplish the same task. This quantum supremacy is, however, restricted to Gaussian boson sampling. To encode interesting problems reflective of real-world applications in a photonic quantum computer will require significant further development.

Xanadu lab. Credit: Xanadu.

Exotic Options and Improvements

These qubit contenders are just the tip of the iceberg. More than an honourable mention should also go to nitrogen-vacancy centres (point defects in the diamond crystal lattice that can host qubits) and neutral atoms, both of which show exciting promise as qubits.

Furthermore, topological qubits continue to be regarded by some as the panacea of quantum computing. Microsoft, for example, has long seen topological qubits as the path to follow. Encoding quantum information in a system’s topological phase of matter instead of in the properties of individual particles or atoms, this approach delivers a layer of abstraction that protects topological qubits from noise. This provides the potential to reduce errors drastically while allowing a given quantum system to scale.

Recently, both Google and Quantinuum made breakthroughs in this direction, announcing the creation of a new breed of topological quasiparticle, the non-Abelian anyon, observed using their superconducting-qubit and trapped-ion quantum computers, respectively. The hope is that this quasiparticle may be the missing piece needed for error-protected computation on scaled quantum computers based on these approaches.

Another ingenious approach to protecting qubits from errors is to spread and encode information over a collection of physical qubits that form a single ‘logical qubit’. This provides a way to perform reliable quantum computations even when noise and errors affect individual physical qubits, as the information is abstracted to the collective logical qubit. Error-corrected logical qubits have been demonstrated with superconducting, trapped-ion, and neutral atom qubits.
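A loose classical analogue of this redundancy is the repetition code (a sketch only: real quantum error correction must also handle phase errors and cannot simply copy quantum states). Spreading one logical bit across three physical bits and decoding by majority vote already suppresses the error rate from roughly \(p\) to roughly \(3p^2\):

```python
import random

def encode(bit):
    # One logical bit is spread redundantly across three physical bits.
    return [bit, bit, bit]

def apply_noise(bits, p):
    # Each physical bit flips independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote corrects any single flipped bit.
    return int(sum(bits) >= 2)

random.seed(0)
p, trials = 0.05, 100_000
raw_errors = sum(apply_noise([0], p)[0] for _ in range(trials))
logical_errors = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
print(raw_errors, logical_errors)  # the encoded (logical) error rate is far lower
```

The logical bit only fails when two or more of its three physical bits flip at once, which is why stacking more physical qubits per logical qubit can, in principle, drive error rates down faster than noise drives them up.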

Do these breakthroughs mean quantum computers are now on the verge of solving real-world practical problems better than modern supercomputers? Well, the answer remains no. Each approach to quantum computing still has significant hurdles to overcome. But with the prize ultimately being extraordinary insights into the universe and discoveries beyond human capabilities, progress on so many different fronts continues at a terrifying pace – quantum computing has never been closer to finally benefiting the world.

The post Qubit Contenders originally appeared on the HLFF SciLogs blog.