
Mutual Information (transinformation)

It turns out that one of the most useful and important measures of information is the mutual information, or transinformation. This is a measure of how much information can be obtained about one random variable by observing another. The mutual information of X relative to Y (which conceptually represents the average amount of information about X that can be gained by observing Y) is given by:

I(X;Y) = \sum_{y \in Y} p(y) \sum_{x \in X} p(x|y) \log \frac{p(x|y)}{p(x)} = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

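As a concrete illustration, the following Python sketch evaluates this double sum for a small, assumed joint distribution p(x, y) (the 2×2 table of probabilities is made up for the example) and reports the result in bits, i.e. using base-2 logarithms:

```python
import numpy as np

# A hypothetical joint distribution p(x, y) over two binary variables
# (rows index x, columns index y); the values are assumed, for illustration only.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

# I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ), skipping zero-probability cells
mi_bits = sum(
    p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
    for i in range(p_xy.shape[0])
    for j in range(p_xy.shape[1])
    if p_xy[i, j] > 0
)
print(f"I(X;Y) = {mi_bits:.4f} bits")
```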
A basic property of the mutual information is that:

I(X;Y) = H(X) - H(X|Y)

That is, knowing Y, we can save an average of I(X;Y) bits in encoding X compared to not knowing Y. Mutual information is symmetric:

I(X;Y) = I(Y;X) = H(X) + H(Y) - H(X,Y)


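Both identities can be checked numerically. The sketch below reuses the same assumed joint table from above and computes H(X), H(Y), H(X,Y), and the conditional entropy H(X|Y) = H(X,Y) - H(Y):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a probability array (zero entries are ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Same assumed joint distribution as in the previous sketch.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

H_X, H_Y, H_XY = entropy_bits(p_x), entropy_bits(p_y), entropy_bits(p_xy)
H_X_given_Y = H_XY - H_Y  # conditional entropy H(X|Y)

print("I via H(X) - H(X|Y)    :", H_X - H_X_given_Y)
print("I via H(X)+H(Y)-H(X,Y) :", H_X + H_Y - H_XY)
```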
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) of the posterior probability distribution of X given the value of Y from the prior distribution on X:

I(X;Y) = \mathbb{E}_{p(y)}\left[ D_{\mathrm{KL}}\left( p(X \mid Y=y) \,\|\, p(X) \right) \right]

In other words, this is a measure of how much, on average, the probability distribution of X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:

I(X;Y) = D_{\mathrm{KL}}\left( p(X,Y) \,\|\, p(X)\,p(Y) \right)

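A minimal sketch of this formulation, again with the assumed joint table from above; it uses scipy.stats.entropy, which returns the Kullback–Leibler divergence when given two distributions:

```python
import numpy as np
from scipy.stats import entropy  # entropy(pk, qk, base) computes D_KL(pk || qk)

# Same assumed joint distribution as in the earlier sketches.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])
p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

# Product of the marginals, flattened to align with the flattened joint.
p_prod = np.outer(p_x, p_y)

# I(X;Y) = D_KL( p(x,y) || p(x)p(y) ), here in bits (base 2).
mi_bits = entropy(p_xy.ravel(), p_prod.ravel(), base=2)
print(f"I(X;Y) = {mi_bits:.4f} bits")
```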
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution, and to Pearson's χ² test: mutual information can be considered a statistic for assessing independence between a pair of variables, and it has a well-specified asymptotic distribution.
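For example, the G-test (log-likelihood ratio) statistic for a contingency table equals 2N·I(X;Y), with I measured in nats and N the total count. The sketch below checks this on an assumed table of counts using scipy.stats.chi2_contingency with lambda_="log-likelihood"; the continuity correction is disabled so the identity holds exactly for a 2×2 table:

```python
import numpy as np
from scipy.stats import chi2_contingency

# A hypothetical contingency table of observed counts (rows: X, columns: Y); values assumed.
counts = np.array([[30, 10],
                   [15, 45]])
N = counts.sum()

# Empirical mutual information in nats from the normalized table.
p_xy = counts / N
p_prod = np.outer(p_xy.sum(axis=1), p_xy.sum(axis=0))
mask = p_xy > 0
mi_nats = float((p_xy[mask] * np.log(p_xy[mask] / p_prod[mask])).sum())

# The G-test statistic equals 2 * N * I(X;Y) in nats.
g_stat, p_value, dof, _ = chi2_contingency(counts, correction=False,
                                            lambda_="log-likelihood")
print("2 * N * I(X;Y):", 2 * N * mi_nats)
print("G statistic   :", g_stat, " p-value:", p_value)
```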
