Quantum Computer

A quantum computer is a computational device that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers differ from transistor-based digital computers: whereas digital computers require data to be encoded into binary digits (bits), quantum computation uses quantum properties to represent data and to perform operations on those data. A theoretical model is the quantum Turing machine, also known as the universal quantum computer. Quantum computers share theoretical similarities with non-deterministic and probabilistic computers, such as the ability to be in more than one state simultaneously. The field of quantum computing was first introduced by Richard Feynman in 1982; a quantum computer with spins as quantum bits had already been formulated for use as a quantum space-time in 1969.
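To make the idea of superposition concrete, a single qubit can be written in Dirac notation (a standard textbook illustration, not part of the original text) as

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

where the amplitudes α and β are complex numbers: measuring the qubit yields 0 with probability |α|² and 1 with probability |β|², and a register of n qubits likewise occupies a superposition over all 2^n basis states.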

Although quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits (quantum bits). Both practical and theoretical research continues, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis.

Large-scale quantum computers would be able to solve certain problems much faster than any classical computer using the best currently known algorithms, such as integer factorization via Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon's algorithm, that run faster than any possible probabilistic classical algorithm. Given unlimited resources, a classical computer can simulate an arbitrary quantum algorithm, so quantum computation does not violate the Church–Turing thesis. However, the computational basis of 500 qubits, for example, would already be too large to be represented on a classical computer, because it would require 2^500 complex values to be stored. (For comparison, a terabyte of digital information stores only 2^43 discrete on/off values.) Nielsen and Chuang point out that "Trying to store all these complex numbers would not be possible on any conceivable classical computer."
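As a rough illustration of that scaling (a hypothetical sketch, not drawn from the original text), the following Python snippet estimates how much classical memory a full n-qubit state vector would need, assuming 16 bytes per complex amplitude:

```python
def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory needed to store all 2**n complex amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * bytes_per_amplitude

# A terabyte is roughly 2**40 bytes, i.e. about 2**43 on/off bits.
TERABYTE = 2 ** 40

for n in (10, 30, 50, 500):
    size = state_vector_bytes(n)
    print(f"{n:>3} qubits -> 2**{n} amplitudes -> {size / TERABYTE:.3e} TB")
```

Even at 50 qubits the state vector already runs to petabytes, and at 500 qubits the figure exceeds any conceivable classical storage, which is the point of the Nielsen and Chuang remark above.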
