Chain Rule
Assume that the combined system determined by two random variables X and Y has joint entropy H(X, Y), that is, we need H(X, Y) bits of information on average to describe its exact state. Now if we first learn the value of X, we have gained H(X) bits of information. Once X is known, we only need H(X, Y) − H(X) bits to describe the state of the whole system. This quantity is exactly the conditional entropy H(Y | X), which gives the chain rule of conditional entropy:

H(Y | X) = H(X, Y) − H(X).
Formally, the chain rule indeed follows from the above definition of conditional entropy:

H(Y | X) = − Σ_{x,y} p(x, y) log p(y | x)
         = − Σ_{x,y} p(x, y) log ( p(x, y) / p(x) )
         = − Σ_{x,y} p(x, y) log p(x, y) + Σ_{x,y} p(x, y) log p(x)
         = H(X, Y) + Σ_x p(x) log p(x)
         = H(X, Y) − H(X),

where the second-to-last step uses the marginalization Σ_y p(x, y) = p(x).
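The identity is easy to check numerically. Below is a minimal Python sketch; the joint distribution p_xy is a made-up illustrative example (not from the source), chosen so that the entropies come out to round numbers. It computes H(Y | X) directly from the definition and compares it against H(X, Y) − H(X):

```python
import math

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1, 2}.
# The probabilities are illustrative and sum to 1.
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25, (0, 2): 0.00,
    (1, 0): 0.00, (1, 1): 0.25, (1, 2): 0.25,
}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal p(x), obtained by summing the joint distribution over y.
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# Conditional entropy computed directly from the definition:
# H(Y | X) = -sum_{x,y} p(x, y) * log2( p(x, y) / p(x) )
h_y_given_x = -sum(
    p * math.log2(p / p_x[x]) for (x, y), p in p_xy.items() if p > 0
)

# The chain rule says the direct computation equals H(X, Y) - H(X).
print(h_y_given_x)                    # 1.0
print(entropy(p_xy) - entropy(p_x))   # 1.0
```

Here H(X, Y) = 2 bits and H(X) = 1 bit, so both computations of H(Y | X) agree at 1 bit, as the chain rule requires.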