Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence between the two variables. The most common unit of measurement of mutual information is the bit, used when logarithms are taken to base 2.
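For concreteness, the standard definition for discrete random variables is I(X; Y) = sum over x, y of p(x, y) log2[ p(x, y) / (p(x) p(y)) ]. The following is a minimal Python sketch, assuming a small made-up joint distribution over two binary variables, that computes this quantity in bits:

    import numpy as np

    # Example joint distribution p(x, y) over two binary variables
    # (assumed values for illustration; any non-negative matrix
    # summing to 1 works).
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])

    p_x = p_xy.sum(axis=1)  # marginal distribution p(x)
    p_y = p_xy.sum(axis=0)  # marginal distribution p(y)

    # I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) );
    # base-2 logarithms give the result in bits.
    mi = 0.0
    for i in range(p_xy.shape[0]):
        for j in range(p_xy.shape[1]):
            if p_xy[i, j] > 0:
                mi += p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))

    print(f"I(X; Y) = {mi:.4f} bits")  # about 0.278 bits for this example

Note that independent variables yield zero mutual information under this definition, since p(x, y) = p(x) p(y) makes every logarithm term vanish.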
