Information Theory and Measure Theory - Entropy As A "measure"

There is an analogy between Shannon's basic "measures" of the information content of random variables and a measure over sets. Namely, the joint entropy, conditional entropy, and mutual information can be regarded as the measures of a set union, a set difference, and a set intersection, respectively (Reza, pp. 106–108).

If we associate the existence of abstract sets $\tilde{X}$ and $\tilde{Y}$ with arbitrary discrete random variables X and Y, somehow representing the information borne by X and Y, respectively, such that:

  • $\mu(\tilde{X} \cap \tilde{Y}) = 0$ whenever X and Y are unconditionally independent, and
  • $\tilde{X} = \tilde{Y}$ whenever X and Y are such that either one is completely determined by the other (i.e. by a bijection);

where $\mu$ is a signed measure over these sets, and we set:

  $H(X) = \mu(\tilde{X}),$
  $H(Y) = \mu(\tilde{Y}),$
  $H(X,Y) = \mu(\tilde{X} \cup \tilde{Y}),$
  $H(X \mid Y) = \mu(\tilde{X} \setminus \tilde{Y}),$
  $I(X;Y) = \mu(\tilde{X} \cap \tilde{Y});$

we find that Shannon's "measure" of information content satisfies all the postulates and basic properties of a formal signed measure over sets, as commonly illustrated in an information diagram. This can be a handy mnemonic device in some situations, e.g.

  $H(X,Y) = H(X) + H(Y \mid X)$ corresponds to $\mu(\tilde{X} \cup \tilde{Y}) = \mu(\tilde{X}) + \mu(\tilde{Y} \setminus \tilde{X}),$
  $I(X;Y) = H(X) - H(X \mid Y)$ corresponds to $\mu(\tilde{X} \cap \tilde{Y}) = \mu(\tilde{X}) - \mu(\tilde{X} \setminus \tilde{Y}).$
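
As a concrete check (an illustrative sketch added here, not part of the original text), the short Python fragment below computes the five quantities for a small, arbitrarily chosen joint distribution p_xy and verifies the identities above numerically; the array values and helper names are assumptions made for this example only.

    import numpy as np

    # Illustrative 2x2 joint distribution p(x, y); any valid joint pmf would do.
    p_xy = np.array([[0.30, 0.20],
                     [0.10, 0.40]])

    def entropy(p):
        """Shannon entropy in bits of a probability array (zero entries ignored)."""
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    p_x = p_xy.sum(axis=1)   # marginal distribution of X
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y

    H_X, H_Y, H_XY = entropy(p_x), entropy(p_y), entropy(p_xy)

    # H(X|Y) computed directly as an average of conditional entropies.
    H_X_given_Y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j]) for j in range(p_xy.shape[1]))

    # I(X;Y) computed directly as the divergence between p(x,y) and p(x)p(y).
    I_XY = float(np.sum(p_xy * np.log2(p_xy / np.outer(p_x, p_y))))

    # Identities read off the information diagram (union, difference, intersection):
    assert np.isclose(H_XY, H_X + H_Y - I_XY)     # mu(X~ u Y~) = mu(X~) + mu(Y~) - mu(X~ n Y~)
    assert np.isclose(H_X_given_Y, H_XY - H_Y)    # mu(X~ \ Y~) = mu(X~ u Y~) - mu(Y~)
    assert np.isclose(I_XY, H_X - H_X_given_Y)    # mu(X~ n Y~) = mu(X~) - mu(X~ \ Y~)

    print(f"H(X)={H_X:.4f}  H(Y)={H_Y:.4f}  H(X,Y)={H_XY:.4f}  "
          f"H(X|Y)={H_X_given_Y:.4f}  I(X;Y)={I_XY:.4f}")

The assertions pass for any strictly positive joint pmf, reflecting the fact that the identities hold regardless of the particular distribution chosen.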

Because the entropy, joint entropy, conditional entropy, and bivariate mutual information of discrete random variables are all nonnegative, many basic inequalities in information theory (among no more than two random variables) can be derived from this formulation by considering the measure μ to be nonnegative.
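
As a brief illustration (a sketch added here, not a derivation quoted from the source), treating $\mu$ as nonnegative on the regions $\tilde{X} \setminus \tilde{Y}$, $\tilde{Y} \setminus \tilde{X}$ and $\tilde{X} \cap \tilde{Y}$ of the diagram immediately yields, for example:

    \begin{align*}
      \mu(\tilde{X} \setminus \tilde{Y}) \ge 0 \;&\Longrightarrow\; H(X \mid Y) \ge 0 \;\Longrightarrow\; H(X,Y) \ge H(Y), \\
      \mu(\tilde{X} \cap \tilde{Y}) \ge 0 \;&\Longrightarrow\; I(X;Y) \ge 0 \;\Longrightarrow\; H(X \mid Y) \le H(X)
        \;\text{ and }\; H(X,Y) \le H(X) + H(Y).
    \end{align*}

That is, conditioning cannot increase entropy and the joint entropy is subadditive; the corresponding statements for three or more variables require the signed (possibly negative) measure mentioned above.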
