Entropy in Thermodynamics and Information Theory

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually denoted H, developed by Claude Shannon and Ralph Hartley in the 1940s. Shannon, although not initially aware of this similarity, commented on it when he publicized information theory in A Mathematical Theory of Communication.
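The parallel is easy to see in the formulas themselves: Shannon's H = -Σ pᵢ log pᵢ and Gibbs's S = -k_B Σ pᵢ ln pᵢ differ only in the logarithm base and the Boltzmann-constant scale factor. A minimal sketch of both (assuming a discrete probability distribution as input; the function names are illustrative, not from any standard library):

```python
import math

# Boltzmann constant in joules per kelvin (exact value in the 2019 SI).
K_B = 1.380649e-23

def shannon_entropy(probs, base=2):
    """Shannon's H = -sum(p_i * log(p_i)); base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs's S = -k_B * sum(p_i * ln(p_i)): the same sum as Shannon's H
    taken in natural log, scaled by the Boltzmann constant."""
    return K_B * shannon_entropy(probs, base=math.e)

# A fair coin (two equally likely microstates):
p = [0.5, 0.5]
h = shannon_entropy(p)   # 1 bit of information entropy
s = gibbs_entropy(p)     # k_B * ln 2 joules per kelvin
```

The only difference between the two quantities here is a constant factor, which is the formal core of the connection the rest of the article examines.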

This article explores the links between the two concepts and the extent to which they can be regarded as connected.
