Definition of Mutual Information
Formally, the mutual information of two discrete random variables X and Y can be defined as:

I(X; Y) = \sum_{y \in Y} \sum_{x \in X} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)}

where p(x,y) is the joint probability distribution function of X and Y, and p(x) and p(y) are the marginal probability distribution functions of X and Y respectively.
In the case of continuous random variables, the summation is replaced by a definite double integral:

I(X; Y) = \int_{Y} \int_{X} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} \, dx \, dy

where p(x,y) is now the joint probability density function of X and Y, and p(x) and p(y) are the marginal probability density functions of X and Y respectively.
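As a concrete continuous instance (a standard closed form, included here as an illustration rather than taken from the original text), if X and Y are jointly Gaussian with correlation coefficient ρ, the double integral evaluates to

I(X; Y) = -\frac{1}{2} \log\left(1 - \rho^2\right)

which is measured in nats when the natural logarithm is used. It is zero when ρ = 0 (the independent case) and grows without bound as |ρ| approaches 1.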
These definitions are ambiguous because the base of the logarithm is not specified. To disambiguate, the function I could be parameterized as I(X, Y, b), where b is the base. Alternatively, since the most common unit of measurement of mutual information is the bit, base 2 can be taken as the default.
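To make the discrete definition and the choice of base concrete, here is a minimal Python sketch (the function name mutual_information and the example joint tables are illustrative assumptions, not taken from the text):

    import math

    def mutual_information(joint, base=2):
        """Mutual information I(X; Y) of a discrete joint distribution.

        joint: dict mapping (x, y) pairs to probabilities p(x, y).
        base:  base of the logarithm; 2 gives the result in bits.
        """
        # Marginal distributions p(x) and p(y).
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0.0) + p
            py[y] = py.get(y, 0.0) + p
        # Sum p(x, y) * log( p(x, y) / (p(x) p(y)) ) over the support.
        return sum(
            p * math.log(p / (px[x] * py[y]), base)
            for (x, y), p in joint.items()
            if p > 0.0
        )

    # X and Y identical fair bits: I(X; Y) = H(X) = 1 bit.
    identical = {(0, 0): 0.5, (1, 1): 0.5}
    print(mutual_information(identical))  # 1.0

    # X and Y independent fair bits: I(X; Y) = 0 bits.
    independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
    print(mutual_information(independent))  # 0.0

With base 2, the identical-variables case returns 1.0 bit, the entropy of a fair coin, and the independent case returns 0, matching the two extremes discussed next.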
Intuitively, mutual information measures the information that X and Y share: it measures how much knowing one of these variables reduces uncertainty about the other. For example, if X and Y are independent, then knowing X gives no information about Y and vice versa, so their mutual information is zero. At the other extreme, if X and Y are identical, then all information conveyed by X is shared with Y: knowing X determines the value of Y and vice versa. In that case the mutual information equals the uncertainty contained in Y (or X) alone, namely the entropy of Y (or, equivalently, of X, since identical variables have equal entropy).
Mutual information is a measure of the inherent dependence expressed in the joint distribution of X and Y relative to the joint distribution of X and Y under the assumption of independence. Mutual information therefore measures dependence in the following sense: I(X; Y) = 0 if and only if X and Y are independent random variables. This is easy to see in one direction: if X and Y are independent, then p(x,y) = p(x) p(y), and therefore:

\log \frac{p(x, y)}{p(x)\, p(y)} = \log 1 = 0

so every term in the sum (or integral) vanishes and I(X; Y) = 0.
Moreover, mutual information is nonnegative (i.e. I(X;Y) ≥ 0; see below) and symmetric (i.e. I(X;Y) = I(Y;X)).
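As a quick numerical check of symmetry using the mutual_information sketch above (the joint table values here are arbitrary illustrative numbers):

    # An asymmetric joint table with arbitrary illustrative values.
    joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}
    # Swap the roles of X and Y and compare.
    swapped = {(y, x): p for (x, y), p in joint.items()}
    print(mutual_information(joint))    # ≈ 0.311 bits
    print(mutual_information(swapped))  # same value: I(X; Y) = I(Y; X)

Both calls return the same positive value, consistent with symmetry and nonnegativity.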