Conditional Entropy - Chain Rule

Chain Rule

Assume that the combined system determined by two random variables X and Y has joint entropy H(X,Y), that is, we need H(X,Y) bits of information on average to describe its exact state. Now if we first learn the value of X, we have gained H(X) bits of information. Once X is known, we only need H(X,Y) − H(X) bits to describe the state of the whole system. This quantity is exactly H(Y|X), which gives the chain rule of conditional entropy:

H(Y|X) = H(X,Y) − H(X).
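
For concreteness, here is a small worked example (not part of the original text): suppose X is a fair coin flip and Y is an exact copy of it, so the pair (X,Y) has only two equally likely states. Then

\begin{align}
H(X,Y) = 1 \text{ bit}, \qquad H(X) = 1 \text{ bit}, \qquad H(Y|X) = H(X,Y) - H(X) = 0 \text{ bits},
\end{align}

reflecting that once X is known, Y costs nothing further to describe. If Y were instead an independent fair flip, we would have H(X,Y) = 2 bits and H(Y|X) = 1 bit = H(Y).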

Formally, the chain rule indeed follows from the above definition of conditional entropy:

\begin{align}
H(Y|X) &= \sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log \frac{p(x)}{p(x,y)} \\
&= -\sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log p(x,y) + \sum_{x\in\mathcal X,\, y\in\mathcal Y} p(x,y)\log p(x) \\
&= H(X,Y) + \sum_{x \in \mathcal X} p(x)\log p(x) \\
&= H(X,Y) - H(X).
\end{align}
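
The identity is also easy to check numerically. The following is a minimal sketch in Python using NumPy (not from the original article); the joint distribution p_xy is an arbitrary made-up example, and the script computes H(Y|X) directly from the definition and compares it with H(X,Y) − H(X).

import numpy as np

# A hypothetical joint distribution p(x, y); rows index x, columns index y.
p_xy = np.array([[0.10, 0.20, 0.05],
                 [0.25, 0.10, 0.30]])
p_x = p_xy.sum(axis=1)                        # marginal distribution of X

H_XY = -np.sum(p_xy * np.log2(p_xy))          # joint entropy H(X,Y), in bits
H_X = -np.sum(p_x * np.log2(p_x))             # entropy H(X), in bits

# H(Y|X) from its definition: sum over (x,y) of p(x,y) * log2(p(x)/p(x,y))
H_Y_given_X = np.sum(p_xy * np.log2(p_x[:, None] / p_xy))

print(H_Y_given_X, H_XY - H_X)                # the two values agree

Entries of p_xy are kept strictly positive here so that the logarithms are well defined; terms with zero probability would contribute nothing and would need to be skipped explicitly.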

