A digestible introduction to how quantum computers work and why they could be essential to the evolution of AI and ML systems. Gain a simple understanding of the quantum principles that power these machines.

Quantum computing is a rapidly accelerating field with the power to revolutionize artificial intelligence (AI) and machine learning (ML). As the demand for bigger, better, and more accurate AI and ML accelerates, standard computers will be pushed to the limits of their capabilities. Rooted in parallelization and able to manage far more complex algorithms, quantum computers will be the key to unlocking the next generation of AI and ML models. This article aims to demystify how quantum computers work by breaking down some of the key principles that enable quantum computing.

A quantum computer is a machine that can perform many tasks in parallel, giving it incredible power to solve very complex problems very quickly. Although traditional computers will continue to serve the day-to-day needs of the average person, the rapid processing capabilities of quantum computers have the potential to revolutionize many industries far beyond what is possible with traditional computing tools. With the ability to run millions of simulations simultaneously, quantum computing could be applied to:

- **Chemical and biological engineering:** complex simulation capabilities may allow scientists to discover and test new drugs and resources without the time, risk, and expense of in-laboratory experiments.
- **Financial investing:** market fluctuations are incredibly challenging to predict because they are influenced by a vast number of compounding factors. The almost infinite possibilities could be modeled by a quantum computer, allowing for more complexity and better accuracy than a standard machine.
- **Operations and manufacturing:** a given process may have thousands of interdependent steps, which makes optimization problems in manufacturing cumbersome. With so many permutations of possibilities, it takes immense compute to simulate manufacturing processes, and assumptions are often required to narrow the range of possibilities to fit within computational limits. The inherent parallelism of quantum computers would enable unconstrained simulations and unlock an unprecedented level of optimization in manufacturing.

## Superposition

Quantum computers rely on the concept of superposition. In quantum mechanics, superposition is the idea of existing in multiple states simultaneously. A condition of superposition is that it cannot be directly observed, because the observation itself forces the system to take on a singular state. While in superposition, there is a certain probability of observing any given state.

## Intuitive understanding of superposition

In 1935, in a letter to Albert Einstein, physicist Erwin Schrödinger shared a thought experiment that encapsulates the idea of superposition. In this thought experiment, Schrödinger describes a cat sealed into a container with a radioactive atom that has a 50% chance of decaying and emitting a deadly amount of radiation. Schrödinger explained that until an observer opens the box and looks inside, there is an equal probability that the cat is alive or dead. Before the box is opened and an observation is made, the cat can be thought of as existing in both the living *and* dead state simultaneously. The act of opening the box and viewing the cat is what forces it to take on a singular state of dead *or* alive.

## Experimental understanding of superposition

A more tangible experiment that shows superposition was performed by Thomas Young in 1801, though the implication of superposition was not understood until much later. In this experiment, a beam of light was aimed at a screen with two slits in it. The expectation was that for each slit, a beam of light would appear on a board placed behind the screen. However, instead of just two spots of light, Young observed several peaks of intensified light separated by troughs of darkness. This pattern allowed Young to conclude that the photons must be acting as waves as they pass through the slits in the screen. He drew this conclusion because he knew that when two waves interfere with each other, if they are both peaking, they add together and the resulting unified wave is intensified (producing the spots of light). In contrast, when two waves are in opposing positions, they cancel out (producing the dark troughs).

While this conclusion of wave-particle duality persisted, as technology evolved so did the meaning of this experiment. Scientists discovered that even if a single photon is emitted at a time, the wave pattern still appears on the back board. This means that the single particle is passing through both slits and acting as two waves that interfere. However, when the photon hits the board and is measured, it appears as an individual photon. The act of measuring the photon’s location forces it to collapse into a single state rather than existing in the multiple states it was in as it passed through the screen. This experiment illustrates superposition.
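The interference pattern Young observed can be sketched numerically. The model below is an illustrative toy, not a physically calibrated simulation: the wavelength, slit separation, and screen distance are arbitrary values chosen so the fringes are easy to see. The photon's amplitude at each point on the back board is the sum of the amplitudes for the two paths, and the detection probability is the squared magnitude of that sum.

```python
import numpy as np

# Toy two-slit model: the photon's amplitude at position x on the back board
# is the sum of two path amplitudes whose relative phase depends on x.
# All units are illustrative, not physical.
wavelength = 1.0
slit_separation = 5.0
screen_distance = 10.0

x = np.linspace(-2, 2, 9)                        # positions on the back board
path_difference = slit_separation * x / np.sqrt(x**2 + screen_distance**2)
phase = 2 * np.pi * path_difference / wavelength

# Superpose the two path amplitudes, then square to get detection probability.
amplitude = np.exp(1j * 0) + np.exp(1j * phase)
intensity = np.abs(amplitude) ** 2

# Intensity oscillates between bright peaks (where the waves add) and dark
# troughs (where they cancel) -- the fringe pattern Young observed.
```

The key point the sketch makes is that the peaks and troughs come from adding *amplitudes* before squaring; adding probabilities directly would give two featureless bright spots instead.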

## Application of superposition to quantum computers

Standard computers work by manipulating binary digits (bits), each stored in one of two states, 0 or 1. A quantum computer, in contrast, uses quantum bits (qubits). Qubits can exist in superposition, so rather than being limited to 0 or 1, a qubit can be 0, 1, or any weighted combination of the two simultaneously. This superposition of states is what allows quantum computers to process many calculations in parallel.
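A qubit's state can be written down concretely as a two-component vector of complex amplitudes, one for 0 and one for 1, where the squared magnitudes give the measurement probabilities. The sketch below uses plain NumPy (rather than a quantum SDK) to show a qubit being put into an equal superposition by a Hadamard gate:

```python
import numpy as np

# A qubit's state is a 2-component complex vector: amplitudes for |0> and |1>.
# The squared magnitude of each amplitude is the probability of measuring
# that value.
zero = np.array([1.0, 0.0])                  # definite |0>, like a classical bit

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ zero

probabilities = np.abs(superposed) ** 2      # ~[0.5, 0.5]: a 50/50 measurement
```

Measuring this qubit yields 0 or 1 with equal probability, and the act of measurement collapses the vector back to a definite state, mirroring the superposition discussion above.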

Qubits are usually constructed of subatomic particles such as photons and electrons, which the double slit experiment confirmed can exist in superposition. Scientists force these subatomic particles into superposition using lasers or microwave beams.

John Davidson explains the advantage of qubits over bits with a simple example. Because everything in a standard computer is made up of 0s and 1s, a simulation run on a standard machine must iterate through different sequences of 0s and 1s (e.g., comparing 00000001 to 10000001). Because a qubit exists as both a 0 and a 1, there is no need to try each combination in turn: a single run can encompass all possible combinations of 0s and 1s simultaneously. This inherent parallelism allows quantum computers to process millions of calculations concurrently.
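The "all combinations at once" idea can be made concrete with the state vector of a small register. A sketch under the same toy NumPy model as above: applying a Hadamard gate to each of 3 qubits produces a single state holding an equal amplitude for every one of the 2³ = 8 bit strings, so one subsequent operation acts on all of them simultaneously.

```python
import numpy as np

n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Build the 3-qubit Hadamard H (x) H (x) H via Kronecker products.
U = H
for _ in range(n - 1):
    U = np.kron(U, H)

state = np.zeros(2 ** n)
state[0] = 1.0               # start in |000>, the all-zeros bit string
state = U @ state

# The register now holds equal amplitudes for all 2**3 = 8 bit strings at
# once, each with probability 1/8.
probs = np.abs(state) ** 2
```

On a classical machine, a simulation touching all 8 strings needs 8 separate passes; here they coexist in one state vector, which is the parallelism the paragraph above describes.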

## Entanglement

In quantum mechanics, entanglement describes the tendency of quantum particles to interact and become correlated in such a way that they can no longer be described in isolation: the state of one particle depends on the state of the other. When two particles become entangled, their states remain linked regardless of their proximity to each other. If the state of one qubit changes, the state of its paired qubit changes instantaneously as well. In awe, Einstein described this distance-independent partnership as “spooky action at a distance.”

Because observing a quantum particle forces it to take on a solitary state, scientists have seen that if a particle in an entangled pair has an upward spin, the partnered particle will have an opposite, downward spin. While it is still not fully understood how or why this happens, the implications have been powerful for quantum computing.
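The perfect correlation between entangled partners can be illustrated with the simplest entangled state, a Bell state. The sketch below (again a plain NumPy toy, not a device simulation) samples joint measurements of the two qubits: the outcomes 01 and 10 never occur, so learning one qubit's value immediately tells you the other's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2): the amplitudes for |01> and |10> are
# zero, so the two qubits' measurement outcomes are perfectly correlated.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2        # ~[0.5, 0, 0, 0.5]

# Simulate 1000 joint measurements of the pair: only "00" and "11" occur.
outcomes = rng.choice(["00", "01", "10", "11"], size=1000, p=probs)
```

Each individual qubit still looks 50/50 random on its own; the entanglement shows up only in the correlation between the pair, which is why it cannot be described one particle at a time.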

In quantum computing, scientists take advantage of this phenomenon: specially designed algorithms work across entangled qubits to speed up calculations drastically. In a standard computer, adding a bit adds processing power linearly, so doubling the bits doubles the processing power. In a quantum computer, adding qubits increases processing power exponentially: each additional qubit doubles the number of states the machine can represent at once.
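The linear-versus-exponential contrast is just arithmetic, and a short loop makes it vivid. A classical register of n bits holds one n-bit value at a time, while an n-qubit register carries amplitudes for all 2ⁿ bit strings simultaneously:

```python
# A classical register of n bits stores a single n-bit value, so capacity
# grows linearly with n. An n-qubit register holds amplitudes for all 2**n
# bit strings at once, so each added qubit doubles the state being processed.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2 ** n} amplitudes")
```

By 50 qubits the state already spans over 10¹⁵ amplitudes, which is why even modest qubit counts are far beyond what a classical machine can simulate exhaustively.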

While entanglement brings a huge advantage to quantum computing, its practical application comes with a severe challenge. As discussed, observing a quantum particle forces it to take on a specific state rather than continuing to exist in superposition. In a quantum system, any outside disturbance (a temperature change, vibration, stray light, etc.) can act as an ‘observation’ that forces a quantum particle to assume a specific state. As particles become increasingly entangled and state-dependent, the system grows especially vulnerable to outside disturbance, because a disturbance needs to affect only one qubit to cascade through the many qubits entangled with it. When a qubit is forced into a 0 or 1 state, the information it held in superposition is lost, causing an error before the algorithm can complete. This challenge, called decoherence, is the main reason quantum computers are not yet in practical use. Decoherence is measured as an error rate.
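Why entangling more qubits makes decoherence worse can be sketched with a deliberately simple error model. The assumption below is illustrative, not a real device specification: each qubit is disturbed independently with some fixed probability per step, so a register of n entangled qubits survives a step undisturbed with probability (1 − p)ⁿ.

```python
# Toy error model (an illustrative assumption, not a real device spec): if
# each qubit is independently disturbed with probability p during one step,
# an n-qubit entangled register survives the step with probability
# (1 - p) ** n, so larger entangled systems decohere faster.
p = 0.01                     # assumed per-qubit disturbance probability
for n in (1, 10, 100):
    error_rate = 1 - (1 - p) ** n
    print(f"{n:>3} qubits: {error_rate:.1%} chance of an error per step")
```

Even with a 1% per-qubit disturbance rate, a 100-qubit entangled register fails well over half the time in a single step under this model, which is why error reduction dominates current research.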

Physical error-reduction techniques, such as keeping quantum computers at extremely low temperatures and in vacuum environments, have been used to minimize disturbance from the outside world, but so far they have not made a meaningful enough difference in quantum error rates. Scientists have also been exploring error-correcting codes that fix errors without destroying the underlying information. While Google recently deployed an error-correcting code that achieved historically low error rates, the loss of information is still too high for quantum computers to be used in practice. Error reduction is currently the major focus for physicists, as it is the most significant barrier to practical quantum computing.

Although more work is needed to bring quantum computers to life, it is clear that there are major opportunities to leverage quantum computing to deploy highly complex AI and ML models to enhance a variety of industries.

Happy Learning!

## Sources

Superposition: https://scienceexchange.caltech.edu/topics/quantum-science-explained/quantum-superposition

Entanglement: https://quantum-computing.ibm.com/composer/docs/iqx/guide/entanglement

Quantum computers: https://builtin.com/hardware/quantum-computing