Conditional Entropy - Chain Rule

Assume that the combined system determined by two random variables X and Y has joint entropy H(X,Y), that is, we need H(X,Y) bits of information to describe its exact state. Now if we first learn the value of X, we have gained H(X) bits of information. Once X is known, we only need H(X,Y) − H(X) bits to describe the state of the whole system. This quantity is exactly H(Y|X), which gives the chain rule of conditional entropy:

\begin{align}
H(Y|X) = H(X,Y) - H(X).
\end{align}
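As a quick check of the chain rule H(Y|X) = H(X,Y) − H(X), consider a hypothetical example (not part of the original text): let X and Y be two independent fair coin flips. Then

\begin{align}
H(X,Y) = 2 \text{ bits}, \qquad H(X) = 1 \text{ bit}, \qquad H(Y|X) = H(X,Y) - H(X) = 1 \text{ bit},
\end{align}

which equals H(Y), as expected: when X carries no information about Y, knowing X does not reduce the number of bits needed to describe Y.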

Formally, the chain rule indeed follows from the above definition of conditional entropy:

\begin{align}
H(Y|X)=&\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log \frac {p(x)} {p(x,y)}\\ =&-\sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log\,p(x,y) + \sum_{x\in\mathcal X, y\in\mathcal Y}p(x,y)\log\,p(x) \\
=& H(X,Y) + \sum_{x \in \mathcal X} p(x)\log\,p(x) \\
=& H(X,Y) - H(X).
\end{align}
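The derivation above can be verified numerically. The sketch below (the joint distribution is a made-up example, not from the text) computes H(Y|X) directly from its definition and confirms that it equals H(X,Y) − H(X):

```python
import math

# Hypothetical joint distribution p(x, y) over binary X and Y,
# chosen arbitrarily for illustration; it sums to 1.
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.2,
    (1, 1): 0.3,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(x), obtained by summing the joint distribution over y.
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# Conditional entropy from the definition used in the derivation:
# H(Y|X) = sum over (x, y) of p(x,y) * log2( p(x) / p(x,y) )
h_y_given_x = sum(
    p * math.log2(p_x[x] / p) for (x, y), p in p_xy.items() if p > 0
)

# Chain rule: H(Y|X) = H(X,Y) - H(X), up to floating-point error.
assert abs(h_y_given_x - (entropy(p_xy) - entropy(p_x))) < 1e-12
```

The assertion holds because the algebraic steps in the derivation are exact; any discrepancy here would only come from floating-point rounding.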
