Definition
If H(Y | X = x) is the entropy of the random variable Y conditioned on the random variable X taking the value x, then H(Y | X) is the result of averaging H(Y | X = x) over all possible values x that X may take.
Given discrete random variables X with support \mathcal{X} and Y with support \mathcal{Y}, the conditional entropy of Y given X is defined as:

H(Y | X) = \sum_{x \in \mathcal{X}} p(x) \, H(Y | X = x) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y | x)
Note: The supports of X and Y can be replaced by their domains if it is understood that 0 \log 0 should be treated as being equal to zero.
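As a minimal illustration of the sum above (not part of the original article), the following Python sketch computes H(Y | X) from a joint probability table; the dictionary joint, the function name conditional_entropy, and the choice of base-2 logarithm are assumptions made for this example.

```python
from math import log2

def conditional_entropy(joint):
    """H(Y|X) = -sum_{x,y} p(x,y) * log2 p(y|x), with 0*log(0) treated as 0."""
    # Marginal p(x), obtained by summing the joint distribution over y.
    p_x = {}
    for (x, _), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p

    h = 0.0
    for (x, _), p_xy in joint.items():
        if p_xy > 0.0:                       # skip zero-probability terms (0 log 0 := 0)
            h -= p_xy * log2(p_xy / p_x[x])  # p(y|x) = p(x,y) / p(x)
    return h

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.50, (1, 1): 0.00}
print(conditional_entropy(joint))  # 0.5 bits: Y is uncertain only when X = 0
```

In this example, H(Y | X = 0) = 1 bit and H(Y | X = 1) = 0, so the average weighted by p(x) = 0.5 each gives 0.5 bits, matching the printed value.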
H(Y | X) = 0 if and only if the value of Y is completely determined by the value of X. Conversely, H(Y | X) = H(Y) if and only if Y and X are independent random variables.
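As a quick check of the "if" direction of the second statement, substituting the independence condition p(y | x) = p(y) into the definition collapses the double sum to the ordinary entropy of Y:

H(Y | X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y | x) = -\sum_{y \in \mathcal{Y}} \Big( \sum_{x \in \mathcal{X}} p(x, y) \Big) \log p(y) = -\sum_{y \in \mathcal{Y}} p(y) \log p(y) = H(Y).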