A **quantum computer** is a computational device that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers differ from transistor-based digital computers. Whereas digital computers require data to be encoded into binary digits (bits), quantum computation uses quantum properties to represent data and perform operations on these data. A theoretical model is the quantum Turing machine, also known as the universal quantum computer. Quantum computers share theoretical similarities with non-deterministic and probabilistic computers, such as the ability to be in more than one state simultaneously. The field of quantum computing was first introduced by Richard Feynman in 1982. A quantum computer with spins as quantum bits was also formulated for use as a quantum space-time in 1969.
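The bit-versus-qubit distinction above can be sketched numerically. In the sketch below (an illustrative toy, not tied to any particular quantum library), a qubit is represented as a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and the Hadamard gate puts the basis state |0⟩ into an equal superposition:

```python
import math

# A classical bit is either 0 or 1. A qubit's state is a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1: the probabilities of
# measuring 0 or 1, respectively.

zero = (1 + 0j, 0 + 0j)  # the basis state |0>

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)  # equal superposition of |0> and |1>
probs = [round(abs(amp) ** 2, 6) for amp in plus]
print(probs)  # both outcomes equally likely: [0.5, 0.5]
```

Unlike a classical bit, which must commit to 0 or 1, this state carries both amplitudes at once until measured.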

Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits (quantum bits). Both practical and theoretical research continues, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis.

Large-scale quantum computers will be able to solve certain problems much faster than any classical computer using the best currently known algorithms, such as integer factorization via Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon's algorithm, that run faster than any possible probabilistic classical algorithm. Given unlimited resources, a classical computer can simulate an arbitrary quantum algorithm, so quantum computation does not violate the Church–Turing thesis. However, the computational basis of 500 qubits, for example, would already be too large to be represented on a classical computer, because it would require 2^500 complex values to be stored. (For comparison, a terabyte of digital information stores only 2^43 *discrete* on/off values.) Nielsen and Chuang point out that "Trying to store all these complex numbers would not be possible on any conceivable classical computer."
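The exponential blow-up in the paragraph above is easy to make concrete. The sketch below (assuming, for illustration, that each amplitude is stored as a 16-byte complex double) computes the memory a classical simulator would need for the full state vector of an n-qubit register:

```python
# An n-qubit state has 2**n complex amplitudes. Assuming each amplitude
# is stored as a complex double (16 bytes), the state vector needs:

def state_vector_bytes(n_qubits: int) -> int:
    """Bytes required to store all 2**n_qubits complex-double amplitudes."""
    return (2 ** n_qubits) * 16

# Already at 30 qubits the state vector fills 16 GiB of RAM:
print(state_vector_bytes(30))  # 17179869184 bytes = 16 GiB

# At 500 qubits, the amplitude count alone (2**500) dwarfs the roughly
# 2**43 bits that a terabyte can hold:
print(2 ** 500 > 2 ** 43)  # True
```

Each added qubit doubles the storage requirement, which is why brute-force classical simulation stalls at a few dozen qubits.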

