In probability theory and information theory, the mutual information (sometimes known by the archaic term transinformation) of two random variables is a quantity that measures the two variables' mutual dependence. The most common unit of measurement of mutual information is the bit, used when logarithms are taken to base 2.
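For concreteness, one standard way to write the definition for two discrete random variables X and Y, with joint distribution p(x, y) and marginal distributions p(x) and p(y), is:

\[ I(X; Y) = \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} p(x, y) \, \log \frac{p(x, y)}{p(x)\, p(y)} \]

Taking the logarithm to base 2 gives mutual information in bits; the natural logarithm gives it in nats. The quantity is zero exactly when X and Y are independent, since then p(x, y) = p(x) p(y) and every logarithm term vanishes.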