Amplitudes are complex numbers, and each possible outcome has a corresponding amplitude. Amplitudes are analogous to conventional probabilities: the squared magnitude of an amplitude gives the probability of measuring the corresponding outcome. Unlike conventional probabilities, amplitudes have phase and can interfere with each other.
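A minimal sketch in plain Python (the amplitude values are illustrative): the squared magnitude gives each amplitude's probability, while opposite phases cancel when amplitudes for the same outcome are summed.

```python
import cmath

# Two amplitudes of equal magnitude but opposite phase (illustrative values).
a1 = 1 / 2
a2 = cmath.exp(1j * cmath.pi) / 2  # phase of pi, i.e. approximately -1/2

# Taken separately, each would contribute |a|^2 = 0.25 to the probability.
p1 = abs(a1) ** 2
p2 = abs(a2) ** 2

# If both paths lead to the same outcome, the amplitudes are summed first,
# and the opposite phases cancel: destructive interference.
p_total = abs(a1 + a2) ** 2

print(p1, p2, p_total)  # 0.25 0.25 ~0.0
```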
Certain quantum operations require, or can be made more efficient by using, extra qubits that do not store the inputs or outputs of the operation. Since these extra qubits do not contain useful information before or after the operation, their role is auxiliary. If the state of the auxiliary qubits is known before the operation, they are known as ‘clean’ qubits (and are usually set to |0⟩). If the state is unknown, they are referred to as ‘dirty’ qubits.
The term backend can refer to either a quantum system or a high-performance classical simulator of a quantum system. Our Compute resources page lists all current IBM Quantum systems and simulators.
The Bacon–Shor code is a subsystem error correcting code. In a subsystem code, information is encoded in a subsystem of a Hilbert space. Subsystem codes allow simpler error correction procedures than codes that encode information in a subspace of a Hilbert space. This simplicity led to the first demonstration of fault-tolerant circuits on a quantum computer.
The Bloch sphere (named after Felix Bloch) is a visual representation of the state of a qubit. Note that the Bloch sphere is different from the q-sphere; for more details, see the q-sphere visualization topic in the IBM Quantum Composer docs. Multiple states can also be simultaneously displayed (see below). The components of the Bloch representation of the qubit state are found from the expectation values of the X, Y, and Z gates. A qubit described by a statevector has unit length and is found on the surface of the Bloch sphere. Qubits characterized by a density matrix will in general have length less than one, as determined by the purity of the state, and lie within the Bloch sphere.
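For a single-qubit statevector (α, β), these expectation values reduce to ⟨X⟩ = 2 Re(α*β), ⟨Y⟩ = 2 Im(α*β), and ⟨Z⟩ = |α|² − |β|². A minimal plain-Python sketch, using the |+⟩ state as an example:

```python
def bloch_vector(alpha, beta):
    """Bloch components from the X, Y, Z expectation values of (alpha, beta)."""
    z = alpha.conjugate() * beta
    return (2 * z.real, 2 * z.imag, abs(alpha) ** 2 - abs(beta) ** 2)

# |+> = (|0> + |1>)/sqrt(2) sits on the +x axis of the Bloch sphere.
s = 2 ** -0.5
x, y, zc = bloch_vector(s + 0j, s + 0j)
print(x, y, zc)  # 1.0 0.0 0.0

# A pure state always has a Bloch vector of unit length.
print((x ** 2 + y ** 2 + zc ** 2) ** 0.5)  # 1.0
```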

In computational complexity theory, bounded-error quantum polynomial time (BQP) is the class of decision problems solvable by a quantum computer in polynomial time, with an error probability of at most 1/3 for all instances. It is the quantum analogue of the complexity class BPP. A decision problem is a member of BQP if there exists a quantum algorithm (an algorithm that runs on a quantum computer) that solves the decision problem with high probability and is guaranteed to run in polynomial time. A run of the algorithm will correctly solve the decision problem with a probability of at least 2/3.
CLOPS, or circuit layer operations per second, is a measure of how many layers of a QV circuit a QPU (quantum processing unit) can execute per unit of time. Find more information about this metric in the paper called Quality, Speed, and Scale: three key attributes to measure the performance of near-term quantum computers.
Classical shadow is a protocol for predicting functions of a quantum state using only a logarithmic number of measurements. Given an unknown state ρ, a tomographically complete set of gates U (e.g. Clifford gates), a set of observables {O_i}, and a quantum channel M (defined by randomly sampling a gate from U, applying it to ρ, and measuring the resulting state), the task is to predict the expectation values Tr(O_i ρ). A list of classical shadows S is created using ρ, U, and M by running a shadow-generation algorithm. When predicting the properties of ρ, a median-of-means estimation algorithm is used to deal with the outliers in S. Classical shadows are useful for direct fidelity estimation, entanglement verification, estimating correlation functions, and predicting entanglement entropy.
Cloud-based quantum computing is the invocation of quantum emulators, simulators or processors through the cloud. Increasingly, cloud services are seen as the method for providing access to quantum processing. Quantum computers achieve their massive computing power by harnessing quantum physics, and when users are allowed access to these quantum-powered computers through the internet, it is known as quantum computing within the cloud.
The coherence of a qubit, roughly speaking, is its ability to maintain superposition over time. It is therefore the absence of “decoherence”, which is any process that collapses the quantum state into a classical state, for instance by interaction with an environment.
Cross-entropy benchmarking (also referred to as XEB) is a quantum benchmarking protocol which can be used to demonstrate quantum supremacy. In XEB, a random quantum circuit is executed on a quantum computer multiple times in order to collect a set of samples in the form of bitstrings {x_i}. The bitstrings are then used to calculate the cross-entropy benchmark fidelity (F_XEB) via a classical computer, given by F_XEB = 2^n ⟨P(x_i)⟩_i − 1,
where n is the number of qubits in the circuit and P(x_i) is the probability of bitstring x_i for the ideal quantum circuit. If F_XEB = 1, the samples were collected from a noiseless quantum computer. If F_XEB = 0, then the samples could have been obtained via random guessing. This means that if a quantum computer did generate those samples, then the quantum computer is too noisy and thus has no chance of performing beyond-classical computations. Since it takes an exponential amount of resources to classically simulate a quantum circuit, there comes a point when the biggest supercomputer running the best classical algorithm for simulating quantum circuits can't compute the XEB. Crossing this point is known as achieving quantum supremacy; and after entering the quantum supremacy regime, XEB can only be estimated.
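A minimal sketch of the estimator in plain Python, with a hypothetical 2-qubit ideal distribution standing in for a simulated circuit. (Note that F_XEB ≈ 1 for a noiseless device strictly holds when the ideal output is Porter–Thomas distributed, as for deep random circuits; for this toy distribution the noiseless value is 2^n Σ_x P(x)² − 1 instead.)

```python
import random

random.seed(0)
n = 2
# Hypothetical ideal output distribution of some 2-qubit random circuit.
p_ideal = {"00": 0.5, "01": 0.1, "10": 0.1, "11": 0.3}

def xeb_fidelity(samples):
    # F_XEB = 2^n * <P(x_i)>_i - 1, averaged over the collected bitstrings.
    return 2 ** n * sum(p_ideal[x] for x in samples) / len(samples) - 1

bitstrings = list(p_ideal)
# A noiseless device samples from the ideal distribution itself.
ideal = random.choices(bitstrings, weights=list(p_ideal.values()), k=200_000)
# A fully depolarized device outputs uniformly random bitstrings.
noisy = random.choices(bitstrings, k=200_000)

print(xeb_fidelity(ideal))  # close to 4 * 0.36 - 1 = 0.44 for this toy case
print(xeb_fidelity(noisy))  # close to 0
```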
The DiVincenzo criteria are a list of conditions that are necessary to construct a quantum computer, and they were first proposed by the theoretical physicist David P. DiVincenzo in his 2000 paper “The Physical Implementation of Quantum Computation”. The DiVincenzo criteria consist of 5+2 conditions that an experimental setup must satisfy in order to successfully implement quantum algorithms, such as Grover’s search algorithm or Shor’s factoring algorithm. The two additional conditions are necessary to implement quantum communication, such as that used in quantum key distribution.
The Eastin–Knill theorem is a no-go theorem that states: "No quantum error correcting code can have a continuous symmetry which acts transversely on physical qubits". In other words, no quantum error correcting code can transversally implement a universal gate set. Since quantum computers are inherently noisy, quantum error correcting codes are used to correct errors that affect information due to decoherence. Decoding error-corrected data in order to perform gates on the qubits makes it prone to errors. Fault-tolerant quantum computation avoids this by performing gates directly on encoded data. A transversal gate performs a gate between two "logical" qubits, each of which is encoded in N "physical" qubits, by pairing up the physical qubits of each encoded qubit ("code block") and performing independent gates on each pair. Transversal gates can be used to perform fault-tolerant, but not universal, quantum computation because they guarantee that errors don't spread uncontrollably through the computation: each qubit in a code block is acted on by at most a single physical gate, and each code block is corrected independently when an error occurs. Due to the Eastin–Knill theorem, a universal set such as {H, S, CNOT, T} can't be implemented transversally. For example, the T gate can't be implemented transversally in the Steane code. This calls for ways of circumventing Eastin–Knill in order to perform universal fault-tolerant quantum computation. In addition to its role in fault-tolerant quantum computation, the Eastin–Knill theorem is also useful for studying quantum gravity via the AdS/CFT correspondence, and in condensed matter physics via quantum reference frames and many-body theory.
Quantum entanglement is a special connection between two qubits. There are many ways of generating entanglement. One way is to bring two qubits close together, perform an operation to entangle them, and then move them apart again. When they are entangled, you can move them arbitrarily far away from each other and they will remain entangled. This entanglement will manifest itself in the outcomes of measurements on these qubits. When measured, these qubits will always yield zero or one randomly, but no matter how far away they are from each other, they will always yield the same outcome. Entanglement has two special properties that underpin all the applications derived from it: the first property is that entanglement cannot be shared. If two qubits are maximally entangled with each other, then no other party in the universe can have a share of this entanglement. This property is called the monogamy of entanglement. The second property of entanglement, which makes it so powerful, is called maximal coordination. This property manifests itself when measuring the qubits. When two qubits that are entangled are measured in the same basis, no matter how far away they are from each other, they will always yield the same outcome. This outcome is not decided beforehand, but is completely random and decided only when the measurement takes place.
Fair-share queuing executes jobs on a quantum system in a dynamic order so that no provider can monopolize the system. The shares in fair-share queuing represent the fraction of system time that is allocated to a given provider. Providers with the most device time have the highest priority in the fair-share algorithm. A provider’s dynamic priority depends on how much of the provider’s allotted system time has been consumed over a given floating window of time. When you send a job, jobs from the provider with the highest dynamic priority (that is, the lowest fraction of allotted time used) at that moment are executed first. For more information, see the Fair-share queuing section.
The five-qubit code is the smallest quantum error correcting code that can protect a logical qubit from any arbitrary single-qubit error. In this code, 5 physical qubits are used to encode the logical qubit. With X and Z being Pauli matrices and I the identity matrix, this code's generators are XZZXI, IXZZX, XIXZZ, and ZXIXZ. Its logical operators are XXXXX and ZZZZZ. Once the logical qubit is encoded, errors on the physical qubits can be detected via stabilizer measurements. A lookup table that maps the results of the stabilizer measurements to the types and locations of the errors gives the control system of the quantum computer enough information to correct errors.
A job ties together all of the relevant information about a computation on IBM Quantum: a quantum circuit, choice of backend, the choice of how many shots to execute on the backend, and the results upon executing the quantum circuit on the backend.
A phase applied to a statevector as a whole, |ψ⟩ → e^{iγ}|ψ⟩. States related by a global phase are equivalent in quantum mechanics; global phases can be ignored. This is a consequence of the fact that only energy differences, as opposed to absolute values, matter in determining the dynamics of physical systems. See An aside on global phase in the Field Guide, found in the IBM Quantum Composer docs.
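A quick plain-Python check (with an illustrative state and phase) that a global phase leaves every measurement probability unchanged:

```python
import cmath

psi = [2 ** -0.5, 1j * 2 ** -0.5]   # example single-qubit statevector
phase = cmath.exp(1j * 0.7)         # arbitrary global phase e^{i*0.7}
phi = [phase * a for a in psi]      # the same physical state

probs_psi = [abs(a) ** 2 for a in psi]
probs_phi = [abs(a) ** 2 for a in phi]
print(probs_psi)  # [0.5, 0.5] (approximately)
print(probs_phi)  # identical: |e^{i*gamma}| = 1, so |e^{i*gamma} a|^2 = |a|^2
```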
The Hadamard test is a method used to create a random variable whose expected value is the real part Re⟨ψ|U|ψ⟩, where |ψ⟩ is a quantum state and U is a unitary gate acting on the space of |ψ⟩. The Hadamard test produces a random variable whose image is in {1, −1} and whose expected value is exactly Re⟨ψ|U|ψ⟩. It is possible to modify the circuit to produce a random variable whose expected value is the imaginary part Im⟨ψ|U|ψ⟩.
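In the Hadamard test circuit, the ancilla measures 0 with probability (1 + Re⟨ψ|U|ψ⟩)/2; assigning +1 to outcome 0 and −1 to outcome 1 gives the random variable above. A minimal plain-Python sketch, where |ψ⟩ = |+⟩ and U = S are illustrative choices (so Re⟨ψ|U|ψ⟩ = 1/2):

```python
import random

random.seed(1)
s = 2 ** -0.5
psi = [s, s]                      # |+>
U = [[1, 0], [0, 1j]]             # the S (phase) gate

# Re<psi|U|psi>, computed directly for reference.
Upsi = [U[0][0] * psi[0] + U[0][1] * psi[1],
        U[1][0] * psi[0] + U[1][1] * psi[1]]
re_overlap = (psi[0].conjugate() * Upsi[0] + psi[1].conjugate() * Upsi[1]).real

# The ancilla reads 0 with probability (1 + Re<psi|U|psi>)/2; sample +1/-1.
p0 = (1 + re_overlap) / 2
samples = [1 if random.random() < p0 else -1 for _ in range(100_000)]
estimate = sum(samples) / len(samples)
print(re_overlap, estimate)  # 0.5, and a sample mean close to 0.5
```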
Magic state distillation is a process that takes in multiple noisy quantum states and outputs a smaller number of more reliable quantum states. It is considered by many experts to be one of the leading proposals for achieving fault tolerant quantum computation. Magic state distillation has also been used to argue that quantum contextuality may be the "magic ingredient" responsible for the power of quantum computers.
The Mølmer–Sørensen gate (or MS gate) is a two-qubit gate used in trapped-ion quantum computing. It was proposed by Klaus Mølmer and Anders Sørensen. Their proposal also extends to gates on more than two qubits.
Elementary particles (the “fermions”) which form matter are described by an equation formulated in 1928 by Paul Dirac, the Dirac equation. It implies that every fundamental particle in the universe has an antiparticle, which has the same mass but the opposite charge. The first antiparticle, the positron (associated with the electron), was found in 1932. The electron and the other elementary particles have distinct antiparticles, and they acquire mass through the Higgs mechanism: in physics they are called “Dirac fermions”. In 1937, the Italian physicist Ettore Majorana derived a more general equation (the Majorana equation) that predicts the existence of neutral fermions (without electric charge) that are their own antiparticles. Majorana fermions are exotic particles because they acquire mass not through the Higgs mechanism but by interacting with themselves, since they are their own antiparticles. This kind of interaction happens without annihilation, because Majorana fermions are very stable and interact very little with “ordinary” matter.
Measurement is the act of observing a quantum state. This observation will yield classical information, such as a bit. It is important to note that this measurement process will change the quantum state. For instance, if the state is in superposition, this measurement will ‘collapse’ it into a classical state: zero or one. This collapse process takes place randomly. Before a measurement is done, there is no way of knowing what the outcome will be. However, it is possible to calculate the probability of each outcome. This probability is a prediction about the quantum state, a prediction that we can test by preparing the state several times, measuring it and then counting the fraction of each outcome.
The no-cloning principle is a fundamental property of quantum mechanics which states that, given an unknown quantum state, there is no reliable way of producing extra copies of that state. This means that information encoded in quantum states is essentially unique. This is sometimes very annoying, such as when you want to protect quantum information from outside influences, but it is also sometimes very useful, such as when you want to communicate securely with someone else.
A quantum assembly language dialect (see QASM). For more information, see the Build your circuit with OpenQASM code topic in the IBM Quantum Composer docs.
Quantum circuit format generated by the Qiskit program. OpenQASM is the low-level language consumed by the Quantum Processing Unit (QPU).
Access to the various services offered by IBM Quantum is controlled by the providers to which you are assigned. A provider is defined by a hierarchical organization of hub, group, and project. A hub is the top level of a given hierarchy (organization) and contains within it one or more groups. These groups are in turn populated with projects. The combination of hub/group/project is called a provider. Users can belong to more than one provider at any given time.
QASM is an abbreviation for quantum assembly language. It is a set of text-based instructions to describe and visualize quantum circuits. IBM Quantum uses a dialect called OpenQASM; see more in the Build your circuit with OpenQASM code topic in the IBM Quantum Composer docs.
For a given problem, the improvement in run time for a quantum computer versus a conventional computer running the best known conventional algorithm.
A quantum algorithm is an algorithm which runs on a realistic model of quantum computation, the most commonly used model being the quantum circuit model of computation. A classical (or non-quantum) algorithm is a finite sequence of instructions, or a step-by-step procedure for solving a problem, where each step or instruction can be performed on a classical computer. Similarly, a quantum algorithm is a step-by-step procedure, where each of the steps can be performed on a quantum computer. Although all classical algorithms can also be performed on a quantum computer, the term quantum algorithm is usually used for those algorithms which seem inherently quantum, or use some essential feature of quantum computation such as quantum superposition or quantum entanglement.
Quantum computing is a type of computation whose operations can harness the phenomena of quantum mechanics, such as superposition, interference, and entanglement. Devices that perform quantum computations are known as quantum computers. Though current quantum computers are too small to outperform usual (classical) computers for practical applications, larger realizations are believed to be capable of solving certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science.
Quantum dots are effectively “artificial atoms.” They are nanocrystals of semiconductor wherein an electron-hole pair can be trapped. The nanometer size is comparable to the wavelength of light and so, just like in an atom, the electron can occupy discrete energy levels. The dots can be confined in a photonic crystal cavity, where they can be probed with laser light.
Quantum computers are always in contact with the environment. This environment can disturb the computational state of the system, thereby causing information loss. Quantum error correction combats this loss by taking the computational state of the system and spreading it out into an entangled state of many qubits. This entanglement allows outside classical observers to observe and remedy disturbances without observing the computational state itself, which would collapse it.
Researchers at QuTech in the Netherlands are trying to build the world’s first quantum internet. Quantum Internet is like the regular Internet but it can send quantum states and establish entanglement. Building a full-scale quantum internet is of course very hard. So they will begin by establishing a small four-node network by 2020. This four-node network would serve as a testbed for the larger network. The four nodes in this network will be four cities in the Netherlands: Delft, Amsterdam, Leiden and The Hague.
The fundamental condition of existence, supported by all empirical evidence, in which an isolated quantum system, such as a free electron, does not possess fixed properties until observed in experiments designed to measure those properties. That is, a particle does not have a specific mass, or position, or velocity, or spin, until those properties are measured. Indeed, in a strict sense the particle does not exist until observed.
Quantum Logic Gates are analogous to conventional electronic logic gates in conventional computers but different in that the system follows the strange rules of quantum mechanics. An early realization of a quantum logic gate used a single trapped beryllium ion to demonstrate a two-bit quantum logic gate. One qubit, the control qubit, is specified by the (quantized) external vibrations of the ion in the atom trap; the two lowest vibrational levels correspond to values of 0 and 1. The other qubit (the target qubit) is specified by an internal state of one of the ion’s electrons; it has a “spin-down” state (0) and a “spin-up” state (1). Shooting laser pulses at a single ion causes it to act as a two-bit “controlled NOT” gate. If the control qubit is 0, the target bit is left alone. If the control qubit is 1, the target bit flips its spin.
Quantum key distribution (QKD) is a method that leverages the properties of quantum mechanics, such as the no-cloning theorem, to allow two people to securely agree on a key, such as a one-time pad (OTP). A key in this context is a secret code-word that is shared only between you and the person you are trying to communicate with. This secret code-word can then be used to encrypt messages such that they can be transmitted without being read by a malicious third party.
Quantum repeaters enable long-distance communication over a quantum network. An optical fiber can transmit a qubit over roughly 100 kilometers. If you want to send quantum information over a much longer distance, a fiber alone is not good enough. To send information over such long distances we need quantum repeaters. Quantum repeaters can be thought of as a series of short entangled links connecting the two points. The quantum information can then be teleported through these links and arrive safely at its destination.
A quantum sensor is a device that exploits quantum correlations, such as quantum entanglement, to achieve a sensitivity or resolution that is better than can be achieved using only classical systems. A quantum sensor can measure the effect of the quantum state of another system on itself. The mere act of measurement influences the quantum state and alters the probability and uncertainty associated with its state during measurement. Quantum sensor is also a term used in other settings where entangled quantum systems are exploited to make better atomic clocks or more sensitive magnetometers. If you have a super sensitive detector, its killer app is surely in measuring the smallest effects you could possibly imagine. This might be the tiny disturbances in space as a gravitational wave goes by; or a small change in a magnetic field, perhaps that of the Earth itself; or even overcoming the shortcomings of conventional radar systems, to build a quantum radar for detecting stealth planes.
Quantum Tunnelling is the quantum mechanical effect in which particles have a finite probability of crossing an energy barrier, or transitioning through an energy state normally forbidden to them by classical physics, due to the wave-like aspect of particles. The probability wave of a particle represents the probability of finding the particle in a certain location, and there is a finite probability that the particle is located on the other side of the barrier.
Quantum simulation, which originated to a great extent with Richard Feynman’s 1982 proposal, has evolved into a field where scientists use a controllable quantum system to study a second, less experimentally feasible quantum phenomenon. In short, a full-scale quantum computer does not yet exist, and classical computers often cannot solve quantum problems, thus a “quantum simulator” presents an attractive alternative to gain insight into, for example, complex material properties.
A calculation on a quantum computer that cannot in practice be performed on any foreseeable conventional computer. Either the number of CPU steps required or the necessary computer memory increases exponentially with the size of the input. This means that for all but the simplest cases, the calculation becomes unfeasible on a real machine using only conventional digital hardware.
Qiskit Python source code that describes the problem to solve, combining classical and quantum computation. It takes inputs from the users, makes the calculation, and returns the results. Each instance of a Qiskit program execution is similar to a computer process and is represented as a runtime job.
The data that defines your program. Critical components are name, maximum execution time, version, backend requirements, and input/output parameters.
Qiskit Runtime is a cloud service that runs the Qiskit program remotely as a process, passing the input from the user, and handling the connectivity between the Qiskit program, the user, and the quantum processing unit. You can repeat this multiple times with the same or different Qiskit programs.
The main task of this component of the architecture is to prepare Qiskit Runtime, load the Qiskit programs, and supervise the correct execution.
A quantum computer is a device capable of executing coherent controlled quantum dynamics.
A quantum gate is a reversible (unitary) operation applied to one or more qubits.
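Reversibility follows from unitarity: applying a gate and then its inverse (for self-inverse gates like the Hadamard gate, the gate itself) returns the identity. A minimal plain-Python sketch:

```python
s = 2 ** -0.5
H = [[s, s], [s, -s]]  # the Hadamard gate

def matmul(A, B):
    """Multiply two 2x2 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

HH = matmul(H, H)  # H is its own inverse, so H*H is the 2x2 identity
print(HH)          # [[1.0, 0.0], [0.0, 1.0]] up to floating-point error
```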
Part of the computational unit that performs the quantum computation.
Quantum volume is a metric that measures the capabilities and error rates of a quantum computer. It expresses the maximum size of square quantum circuits that can be implemented successfully by the computer. The form of the circuits is independent of the quantum computer architecture, but a compiler can transform and optimize them to take advantage of the computer's features. Thus, quantum volumes for different architectures can be compared.
Quantum error correction (QEC) is used in quantum computing to protect quantum information from errors due to decoherence and other quantum noise. Quantum error correction is theorized to be essential for achieving fault-tolerant quantum computation that can reduce the effects of noise on stored quantum information, faulty quantum gates, faulty quantum state preparation, and faulty measurements.
Quantum image processing (QIMP) is the use of quantum computing or quantum information processing to create and work with quantum images. Due to some of the properties inherent to quantum computation, notably entanglement and parallelism, it is hoped that QIMP technologies will offer capabilities and performance that surpass their traditional equivalents, in terms of computing speed, security, and minimum storage requirements.
Quantum programming is the process of assembling sequences of instructions, called quantum programs, that are capable of running on a quantum computer. Quantum programming languages help express quantum algorithms using high-level constructs. The field is deeply rooted in the open-source philosophy, and as a result most quantum software is freely available as open-source software.
Quantum simulators permit the study of quantum systems in a programmable fashion. In this instance, simulators are special-purpose devices designed to provide insight about specific physics problems. Quantum simulators may be contrasted with generally programmable "digital" quantum computers, which would be capable of solving a wider class of quantum problems.
In quantum information science, quantum state discrimination refers to the task of inferring the quantum state that produced the observed measurement probabilities. More precisely, in its standard formulation, the problem involves performing some POVM {μ_i} on a given unknown state ρ, under the promise that the state received is an element of a collection of states {ρ_i}, with ρ_i occurring with probability p_i, that is, ρ = Σ_i p_i ρ_i. The task is then to find the probability of the POVM correctly guessing which state was received. Since the probability of the POVM returning the i-th outcome when the given state was ρ_j has the form Tr(μ_i ρ_j), it follows that the probability of successfully determining the correct state is P = Σ_i p_i Tr(μ_i ρ_i).
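A minimal plain-Python sketch of the success-probability formula, for the hypothetical ensemble {|0⟩⟨0|, |+⟩⟨+|} with equal priors, discriminated by a simple Z-basis measurement (the optimal Helstrom measurement would do slightly better for this pair, about 0.854):

```python
# Hypothetical ensemble: rho_0 = |0><0| and rho_1 = |+><+|, equal priors.
rho = [
    [[1.0, 0.0], [0.0, 0.0]],   # |0><0|
    [[0.5, 0.5], [0.5, 0.5]],   # |+><+|
]
priors = [0.5, 0.5]

# POVM: Z-basis measurement; outcome i is interpreted as "guess rho_i".
povm = [
    [[1.0, 0.0], [0.0, 0.0]],   # |0><0|
    [[0.0, 0.0], [0.0, 1.0]],   # |1><1|
]

def trace_prod(A, B):
    """Tr(A B) for 2x2 matrices."""
    return sum(A[i][k] * B[k][i] for i in range(2) for k in range(2))

# P = sum_i p_i Tr(mu_i rho_i)
p_success = sum(p * trace_prod(mu, r) for p, mu, r in zip(priors, povm, rho))
print(p_success)  # 0.75
```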
Quantum supremacy or quantum advantage, is the goal of demonstrating that a programmable quantum device can solve a problem that no classical computer can solve in any feasible amount of time (irrespective of the usefulness of the problem). Conceptually, quantum supremacy involves both the engineering task of building a powerful quantum computer and the computational-complexity-theoretic task of finding a problem that can be solved by that quantum computer and has a superpolynomial speedup over the best known or possible classical algorithm for that task. The term was coined by John Preskill in 2012, but the concept of a quantum computational advantage, specifically for simulating quantum systems, dates back to Yuri Manin's (1980) and Richard Feynman's (1981) proposals of quantum computing. Examples of proposals to demonstrate quantum supremacy include the boson sampling proposal of Aaronson and Arkhipov, D-Wave's specialized frustrated cluster loop problems, and sampling the output of random quantum circuits.
A quantum Turing machine (QTM), or universal quantum computer, is an abstract machine used to model the effects of a quantum computer. It provides a simple model that captures all of the power of quantum computation—that is, any quantum algorithm can be expressed formally as a particular quantum Turing machine. However, the computationally equivalent quantum circuit is a more common model.
A qubit (/ˈkjuːbɪt/) or quantum bit is a basic unit of quantum information—the quantum version of the classic binary bit physically realized with a two-state device. A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarity of quantum mechanics. Examples include the spin of the electron in which the two levels can be taken as spin up and spin down; or the polarization of a single photon in which the two states can be taken to be the vertical polarization and the horizontal polarization. In a classical system, a bit would have to be in one state or the other. However, quantum mechanics allows the qubit to be in a coherent superposition of both states simultaneously, a property that is fundamental to quantum mechanics and quantum computing.
Quil is a quantum instruction set architecture that first introduced a shared quantum/classical memory model. It was introduced by Robert Smith, Michael Curtis, and William Zeng in A Practical Quantum Instruction Set Architecture. Many quantum algorithms (including quantum teleportation, quantum error correction, simulation, and optimization algorithms) require a shared memory architecture. Quil is being developed for the superconducting quantum processors developed by Rigetti Computing through the Forest quantum programming API. A Python library called pyQuil was introduced to develop Quil programs with higher-level constructs. A Quil backend is also supported by other quantum programming environments.
A qutrit (or quantum trit) is a unit of quantum information that is realized by a 3-level quantum system, which may be in a superposition of three mutually orthogonal quantum states. The qutrit is analogous to the classical radix-3 trit, just as the qubit, a quantum system described by a superposition of two orthogonal states, is analogous to the classical radix-2 bit. There is ongoing work to develop quantum computers using qutrits and qubits with multiple states.
A quantum register is a collection of qubits on which gates and other operations act. A classical register consists of bits that can be written to and read within the coherence time of the quantum circuit.
A phase difference between components of a superposition state. By convention, the first term in a superposition is made to be real, and the remaining states have phase values relative to this, e.g., (|0⟩ + e^{iφ}|1⟩)/√2.
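Unlike a global phase, a relative phase is physically observable: it leaves Z-basis probabilities unchanged but shifts the outcome of an X-basis measurement (apply H, then measure). A plain-Python sketch with φ = π, i.e. the |+⟩ and |−⟩ states:

```python
import cmath

s = 2 ** -0.5
plus = [s, s]                               # (|0> + |1>)/sqrt(2)
minus = [s, cmath.exp(1j * cmath.pi) * s]   # relative phase pi: (|0> - |1>)/sqrt(2)

# Z-basis probabilities cannot see the relative phase...
print([abs(a) ** 2 for a in plus])    # [0.5, 0.5]
print([abs(a) ** 2 for a in minus])   # [0.5, 0.5] as well

# ...but rotating to the X basis with a Hadamard gate reveals it.
def apply_h(v):
    return [s * (v[0] + v[1]), s * (v[0] - v[1])]

px_plus = [abs(a) ** 2 for a in apply_h(plus)]
px_minus = [abs(a) ** 2 for a in apply_h(minus)]
print(px_plus)   # ~[1.0, 0.0]
print(px_minus)  # ~[0.0, 1.0]
```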
A seed is the value introduced into the algorithm that generates pseudorandom numbers. The simulator creates randomness by generating results based on the seed.
Because the measurement of a qubit in a superposition state is random — the outcome is sometimes 0 and sometimes 1 — you must repeat the measurement multiple times to determine the likelihood that a qubit is in a particular state. When performing the experiment, you will be asked how many shots, or executions, to run in order to determine the qubit state probabilities.
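A plain-Python sketch of why repetition is needed (the qubit state here is a hypothetical example with P(1) = 0.7): each shot yields one random bit, and the empirical frequency over many shots estimates the underlying probability.

```python
import random

random.seed(3)
p1 = 0.7       # true probability of measuring 1 (hypothetical state)
shots = 8192   # a typical number of repetitions

outcomes = [1 if random.random() < p1 else 0 for _ in range(shots)]
estimate = sum(outcomes) / shots
print(estimate)  # close to 0.7; a single shot alone reveals almost nothing
```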
For a quantum computer composed of a small number of qubits n, we can simulate its behavior on a classical computer. In general such a computation requires storing 2^n complex numbers, where n is the number of qubits. For circuits composed solely of Clifford gates, or circuits generating quantum states that are weakly entangled, special simulation techniques allow for simulating a greater number of qubits. See the Simulators overview topic to learn about the IBM Quantum simulators.
Any single realization of a quantum system can be described through a complex vector known as its statevector. In a gate-based quantum computer the state of n qubits has 2^n elements; the dimension of the statevector grows exponentially with n.
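A quick plain-Python illustration of this exponential growth, assuming 16 bytes per complex amplitude (double precision):

```python
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n} qubits: {amplitudes} amplitudes, {gib:,.6g} GiB")
# 30 qubits already needs 16 GiB; 40 qubits needs 16 TiB.
```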
Superconducting quantum computing is an implementation of a quantum computer in superconducting electronic circuits. Research in superconducting quantum computing is conducted by IBM, Google, Rigetti Computing, Microsoft and Intel. The devices are typically designed in the radio-frequency spectrum, cooled down in dilution refrigerators below 100 mK and addressed with conventional electronic instruments, e.g. frequency synthesizers and spectrum analyzers. Typical dimensions on the scale of micrometers, with sub-micrometer resolution, allow a convenient design of a quantum Hamiltonian with well-established integrated circuit technology.
A superposition in quantum mechanics is a weighted sum, or linear combination, of two or more quantum states. A quantum computer with n qubits can exist in a superposition of all 2^n of its computational basis states, |00...0⟩ through |11...1⟩. Exploiting this ability is fundamental to most quantum algorithms.
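A small NumPy sketch (illustrative, not from the source): applying a Hadamard gate to each of n qubits starting in |0...0⟩ produces an equal superposition over all 2^n basis states.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                 # start in |000>

H_n = H
for _ in range(n - 1):
    H_n = np.kron(H_n, H)      # H tensor H tensor H
state = H_n @ state

print(np.abs(state) ** 2)      # each of the 8 basis states has probability 1/8
```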
In quantum information and computation, the Solovay–Kitaev theorem says, roughly, that if a set of single-qubit quantum gates generates a dense subset of SU(2) then that set is guaranteed to fill SU(2) quickly, which means any desired gate can be approximated by a fairly short sequence of gates from the generating set. Robert M. Solovay initially announced the result on an email list in 1995, and Alexei Kitaev independently gave an outline of its proof in 1997. Solovay also gave a talk on his result at MSRI in 2000 but it was interrupted by a fire alarm. Christopher M. Dawson and Michael Nielsen call the theorem one of the most important fundamental results in the field of quantum computation.
Quantum teleportation is a method to send qubits using entanglement. Teleportation works as follows: first, Alice and Bob establish an entangled pair of qubits between them. Alice then takes the qubit that she wants to send and the qubit that is entangled with Bob’s qubit and performs a joint measurement on them. This measurement collapses the qubits and destroys the entanglement, but gives her two classical outcomes in the form of two classical bits. Alice sends these two classical bits over a classical channel (e.g., the Internet) to Bob. Bob then applies a correction operation, which depends on these two classical bits, to his qubit. This allows him to recover the qubit that was originally in Alice’s possession. Note that a qubit has been transmitted without using a physical carrier capable of carrying qubits, although pre-shared entanglement is required. It is also important to note that quantum teleportation does not allow faster-than-light communication: Bob cannot make sense of the qubit in his possession before he receives the classical measurement outcomes from Alice, and the time to transmit those classical bits is lower bounded by the distance divided by the speed of light.
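The protocol above can be sketched with a small NumPy statevector simulation (an illustrative sketch, not a hardware implementation; qubit 0 holds Alice's message, qubits 1 and 2 the shared Bell pair):

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def op(gate, qubit, n=3):
    """Lift a single-qubit gate to an n-qubit operator (qubit 0 = leftmost)."""
    mats = [gate if q == qubit else I for q in range(n)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def cnot(control, target, n=3):
    """Build an n-qubit CNOT matrix by permuting basis states."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        U[j, i] = 1
    return U

# Qubit 0: Alice's message a|0> + b|1>; qubits 1 and 2: shared Bell pair
a, b = 0.6, 0.8
state = np.kron(np.array([a, b]), np.array([1, 0, 0, 0]))  # |psi> (x) |00>
state = cnot(1, 2) @ (op(H, 1) @ state)                    # entangle qubits 1, 2

# Alice's Bell-basis measurement circuit, then measure qubits 0 and 1
state = op(H, 0) @ (cnot(0, 1) @ state)
rng = np.random.default_rng(0)
outcome = rng.choice(8, p=np.abs(state) ** 2)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1   # classical bits sent to Bob

# Collapse onto the measured branch and renormalize
keep = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)])
state = np.where(keep, state, 0)
state = state / np.linalg.norm(state)

# Bob's correction: X if m1 == 1, then Z if m0 == 1
if m1:
    state = op(X, 2) @ state
if m0:
    state = op(Z, 2) @ state

# Bob's qubit now holds a|0> + b|1>
bob = state.reshape(2, 2, 2)[m0, m1, :]
print(bob)  # ~ [0.6, 0.8]
```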
Transpilation is the process whereby a quantum circuit is transformed into a new quantum circuit that performs the same task, but is restructured to be compatible with the physical layout of a particular quantum system and, where possible, to optimize its performance.
A topological quantum computer is a theoretical quantum computer that employs two-dimensional quasiparticles called anyons, whose world lines pass around one another to form braids in a three-dimensional spacetime (i.e., one temporal plus two spatial dimensions). These braids form the logic gates that make up the computer. The advantage of a quantum computer based on quantum braids over using trapped quantum particles is that the former is much more stable. Small, cumulative perturbations can cause quantum states to decohere and introduce errors in the computation, but such small perturbations do not change the braids’ topological properties. This is like the effort required to cut a string and reattach the ends to form a different braid, as opposed to a ball (representing an ordinary quantum particle in four-dimensional spacetime) bumping into a wall. Alexei Kitaev proposed topological quantum computation in 1997.
In quantum physics, we cannot simultaneously know two non-commuting variables (like the position and momentum of a particle). This implies that a quantum system in a perfectly definite state can be certain under one measurement and completely random under another. Moreover, if a quantum system starts out in an arbitrary unknown state, no measurement can reveal complete information about that state; the more information the measurement reveals, the more the state is disturbed. This is an underlying principle of quantum cryptography.
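The "certain under one measurement, random under another" behavior can be sketched with NumPy (an illustrative example, not from the source): the definite state |0⟩ measured in the Z basis versus the X basis.

```python
import numpy as np

zero = np.array([1.0, 0.0])                   # the definite state |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # rotates the X basis onto Z

# Z-basis measurement: perfectly certain, always outcome 0
print(np.abs(zero) ** 2)      # [1. 0.]

# X-basis measurement of the same state: completely random, 50/50
print(np.abs(H @ zero) ** 2)  # [0.5 0.5]
```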
A universal fault-tolerant quantum computer is the grand challenge of quantum computing. It is a device that can properly perform universal quantum operations using unreliable components. See also universal quantum computer.
A universal quantum computer is a machine that can simulate an arbitrary quantum state from an arbitrary initial quantum state. See also universal fault-tolerant quantum computer.
A Quantum Turing machine (QTM), also a universal quantum computer, is an abstract machine used to model the effect of a quantum computer. It provides a very simple model which captures all of the power of quantum computation. Any quantum algorithm can be expressed formally as a particular quantum Turing machine. Such Turing machines were first proposed in a 1985 article written by Oxford University physicist David Deutsch suggesting quantum gates could function in a similar fashion to traditional digital computing binary logic gates. Quantum Turing machines are not always used for analyzing quantum computation; the quantum circuit is a more common model. These models are computationally equivalent.