Entropy in Thermodynamics and Information Theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually denoted by H, of Claude Shannon and Ralph Hartley, developed in the 1940s. Shannon, though not initially aware of the similarity, commented on it when he introduced information theory in A Mathematical Theory of Communication (1948).
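
For reference, the two expressions in question are the Gibbs entropy of statistical mechanics and the Shannon entropy of a discrete probability distribution. Up to the choice of logarithm base and Boltzmann's constant k_B, they share the same functional form:

    S = -k_B \sum_i p_i \ln p_i        (Gibbs entropy)
    H = -\sum_i p_i \log_2 p_i         (Shannon entropy)

Here p_i is the probability of the i-th microstate (for S) or message (for H). Taking natural logarithms and setting k_B = 1 makes the two expressions coincide, which is the formal parallel the rest of the article examines.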

This article explores what links there are between the two concepts, and how far they can be regarded as connected.
