Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the mutual dependence between the two variables. The most common unit of measurement is the bit, obtained when logarithms are taken to base 2; using the natural logarithm instead yields the nat.
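For discrete variables, mutual information can be computed directly from the joint distribution as I(X; Y) = Σ p(x, y) log₂( p(x, y) / (p(x) p(y)) ). The sketch below illustrates this for a joint distribution given as a probability table; the function name and table layout are illustrative choices, not from the original text.

```python
import math

def mutual_information(joint):
    """Mutual information I(X; Y) in bits, given a joint probability
    table where joint[x][y] = P(X = x, Y = y) and all entries sum to 1."""
    # Marginal distributions: sum over rows for P(X), over columns for P(Y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:  # terms with zero probability contribute nothing
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

# Two independent fair coins: knowing one tells you nothing about the other.
indep = [[0.25, 0.25],
         [0.25, 0.25]]

# Two perfectly correlated fair coins: one fully determines the other.
corr = [[0.5, 0.0],
        [0.0, 0.5]]
```

The independent table gives 0 bits of mutual information, while the perfectly correlated table gives 1 bit, matching the entropy of a single fair coin.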

