Quantities of Information - Mutual Information (transinformation)

It turns out that one of the most useful and important measures of information is the mutual information, or transinformation. This is a measure of how much information can be obtained about one random variable by observing another. The mutual information of X relative to Y (which represents conceptually the average amount of information about X that can be gained by observing Y) is given by:

I(X;Y) = \sum_{y \in Y} p(y) \sum_{x \in X} p(x \mid y) \log_2 \frac{p(x \mid y)}{p(x)} = \sum_{x, y} p(x, y) \log_2 \frac{p(x, y)}{p(x)\, p(y)}
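
To make the definition concrete, here is a minimal sketch (not from the original text) that evaluates the double sum in bits for a small, purely hypothetical 2×2 joint distribution; the array p_xy and its marginals are invented for illustration and assume numpy is available.

import numpy as np

p_xy = np.array([[0.30, 0.20],     # hypothetical joint probabilities p(x, y)
                 [0.10, 0.40]])
p_x = p_xy.sum(axis=1)             # marginal distribution p(x)
p_y = p_xy.sum(axis=0)             # marginal distribution p(y)

mutual_info = sum(
    p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
    for i in range(p_xy.shape[0])
    for j in range(p_xy.shape[1])
    if p_xy[i, j] > 0              # 0 * log 0 terms contribute nothing
)
print(f"I(X;Y) = {mutual_info:.4f} bits")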
A basic property of the mutual information is that:

I(X;Y) = H(X) - H(X \mid Y)
That is, knowing Y, we can save an average of I(X;Y) bits in encoding X compared to not knowing Y. Mutual information is symmetric:

I(X;Y) = I(Y;X) = H(X) + H(Y) - H(X, Y)
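
Continuing the hypothetical example above (reusing p_xy, p_x and p_y from the previous sketch), these identities can be checked numerically by computing the entropies directly:

def H(p):                          # Shannon entropy in bits of a probability vector
    p = p[p > 0]                   # 0 * log 0 terms contribute nothing
    return -np.sum(p * np.log2(p))

H_x, H_y = H(p_x), H(p_y)
H_xy = H(p_xy.flatten())           # joint entropy H(X,Y)
H_x_given_y = H_xy - H_y           # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(H_x - H_x_given_y)           # I(X;Y) = H(X) - H(X|Y)
print(H_x + H_y - H_xy)            # symmetric form, same value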
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) of the posterior probability distribution of X given the value of Y from the prior distribution on X:

I(X;Y) = \mathbb{E}_{p(y)} \left[ D_{\mathrm{KL}}\big( p(X \mid Y = y) \,\|\, p(X) \big) \right]
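
As a sketch of this reading (again reusing the hypothetical p_xy, p_x and p_y from above), the same value is obtained by averaging, over y, the Kullback–Leibler divergence between the posterior p(X|Y=y) and the prior p(X); scipy.stats.entropy computes the divergence when given two distributions.

from scipy.stats import entropy    # entropy(pk, qk, base) gives D_KL(pk || qk)

avg_kl = sum(
    p_y[j] * entropy(p_xy[:, j] / p_y[j], p_x, base=2)   # D_KL(posterior || prior)
    for j in range(len(p_y))
)
print(avg_kl)                      # matches I(X;Y) from the earlier sketches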
In other words, this is a measure of how much, on average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X;Y) = D_{\mathrm{KL}}\big( p(X, Y) \,\|\, p(X)\, p(Y) \big)
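
The same quantity computed this second way, for the hypothetical distribution above (reusing p_xy, p_x, p_y and the entropy import from the earlier sketches):

product_of_marginals = np.outer(p_x, p_y)                 # p(x) p(y)
print(entropy(p_xy.flatten(), product_of_marginals.flatten(), base=2))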
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ² test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
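
As a rough numerical illustration of this connection (the counts below are hypothetical, not from the text): for an observed contingency table with N samples, the G-test (log-likelihood ratio) statistic equals 2N times the mutual information of the empirical joint distribution measured in nats, which scipy's chi2_contingency can confirm.

import numpy as np
from scipy.stats import chi2_contingency

counts = np.array([[30, 20],        # hypothetical observed contingency table
                   [10, 40]])
N = counts.sum()
p_xy = counts / N                   # empirical joint distribution
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

mask = p_xy > 0
mi_nats = np.sum(p_xy[mask] * np.log(p_xy[mask] / np.outer(p_x, p_y)[mask]))

g_stat, p_value, dof, expected = chi2_contingency(counts, correction=False,
                                                  lambda_="log-likelihood")
print(2 * N * mi_nats, g_stat)      # the two agree up to floating-point error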
