Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables measures their mutual dependence: how much knowing one of the variables reduces uncertainty about the other. The most common unit of measurement for mutual information is the bit, obtained when logarithms to base 2 are used.
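As a concrete illustration of the definition, here is a minimal sketch (the function name and the choice of representing the joint distribution as a nested list are my own) that computes I(X;Y) = Σ p(x,y) log p(x,y) / (p(x)p(y)) for discrete variables, in bits when base 2 is used:

```python
import math

def mutual_information(joint, base=2):
    """Mutual information of two discrete random variables.

    `joint` is a 2D list where joint[i][j] = P(X=i, Y=j).
    With base=2 the result is measured in bits.
    """
    # Marginal distributions obtained by summing rows and columns.
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # terms with zero probability contribute nothing
                mi += pxy * math.log(pxy / (px[i] * py[j]), base)
    return mi

# Two independent fair coins share no information: I(X;Y) = 0 bits.
indep = [[0.25, 0.25], [0.25, 0.25]]
# Two perfectly correlated fair coins share one full bit: I(X;Y) = 1 bit.
corr = [[0.5, 0.0], [0.0, 0.5]]
print(mutual_information(indep))  # 0.0
print(mutual_information(corr))   # 1.0
```

The two test distributions bracket the possible behavior: independence gives zero mutual information, while a deterministic one-to-one relationship between two fair binary variables gives exactly one bit.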

Read more about Mutual Information: Definition of Mutual Information, Relation To Other Quantities, Variations of Mutual Information, Applications of Mutual Information
