Mutual Information

In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the variables' mutual dependence: how much knowing the value of one of them reduces uncertainty about the other. The most common unit of measurement is the bit, used when the logarithms in the definition are taken to base 2; with natural logarithms the unit is the nat.
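
For discrete random variables X and Y with joint distribution p(x, y) and marginals p(x) and p(y), the standard definition is

\[ I(X;Y) = \sum_{y} \sum_{x} p(x, y) \, \log \frac{p(x, y)}{p(x)\, p(y)}, \]

which is zero exactly when X and Y are independent, since the ratio inside the logarithm is then 1 for every pair (x, y).

The sketch below computes this sum directly from a dense joint probability table. It is a minimal illustration under that assumption, not library code, and the helper name `mutual_information` is my own.

```python
import math

def mutual_information(joint, base=2.0):
    """Compute I(X;Y) from a dense joint probability table.

    `joint` is a 2D list where joint[i][j] = P(X = i, Y = j).
    With base=2 the result is in bits; pass base=math.e for nats.
    """
    # Marginal distributions: row sums give p(x), column sums give p(y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, pxy in enumerate(row):
            if pxy > 0:  # terms with p(x, y) = 0 contribute nothing
                mi += pxy * math.log(pxy / (px[i] * py[j]), base)
    return mi

# Two perfectly correlated fair bits share 1 bit of information...
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # -> 1.0
# ...while two independent fair bits share none.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # -> 0.0
```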

Read more about Mutual Information: Definition of Mutual Information, Relation To Other Quantities, Variations of Mutual Information, Applications of Mutual Information
