Entropy (Information Theory)

In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message. Entropy is typically measured in bits, nats, or bans.
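
For illustration, the following Python sketch computes the Shannon entropy of a discrete probability distribution; changing the logarithm base between 2, e, and 10 yields the value in bits, nats, or bans respectively. (The function name and the example distribution are choices made for this sketch, not part of Shannon's formulation.)

    import math

    def shannon_entropy(probabilities, base=2):
        # H(X) = -sum over x of p(x) * log_base p(x); zero-probability
        # outcomes contribute nothing and are skipped.
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    uniform = [0.25, 0.25, 0.25, 0.25]           # a fair four-sided die
    print(shannon_entropy(uniform, base=2))      # 2.0 bits
    print(shannon_entropy(uniform, base=math.e)) # ~1.386 nats
    print(shannon_entropy(uniform, base=10))     # ~0.602 bans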

Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints. Treating the messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average number of symbols from a given target alphabet needed to represent the messages is their entropy divided by the logarithm of the number of symbols in that alphabet.
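
As a numerical illustration of this bound (the probabilities and target alphabet below are assumptions chosen for the example, not from the original text), the sketch computes the minimum average number of target-alphabet symbols needed per source symbol as the entropy in bits divided by the base-2 logarithm of the target alphabet size.

    import math

    def min_code_length_per_symbol(probabilities, target_alphabet_size):
        # Source coding theorem bound: entropy in bits divided by the
        # base-2 logarithm of the target alphabet size gives the smallest
        # achievable average number of target symbols per source symbol,
        # in the limit of long i.i.d. messages.
        entropy_bits = -sum(p * math.log2(p) for p in probabilities if p > 0)
        return entropy_bits / math.log2(target_alphabet_size)

    # A source emitting symbols with probabilities 0.5, 0.25, 0.25 has an
    # entropy of 1.5 bits, so a ternary code needs at least
    # 1.5 / log2(3) ~= 0.946 ternary symbols per source symbol on average.
    print(min_code_length_per_symbol([0.5, 0.25, 0.25], 3))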

A single toss of a fair coin has an entropy of one bit. A series of two fair coin tosses has an entropy of two bits. The entropy rate for the coin is one bit per toss. However, if the coin is not fair, then the uncertainty is lower (if asked to bet on the next outcome, we would bet preferentially on the most frequent result), and thus the Shannon entropy is lower. Mathematically, a sequence of coin flips (fair or not) is an example of a Bernoulli process, and its entropy is given by the binary entropy function. The entropy rate of English text is between 1.0 and 1.5 bits per letter, or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.
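
The coin example can be made concrete with the binary entropy function H(p) = -p log2 p - (1 - p) log2 (1 - p), where p is the probability of heads. The following is a minimal sketch; the bias value 0.9 is an arbitrary choice for illustration.

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))  # 1.0 bit per toss for a fair coin
    print(binary_entropy(0.9))  # ~0.469 bits for a coin biased 90% towards heads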
