Quantities of Information - Joint Entropy

The joint entropy of two discrete random variables X and Y is defined as the entropy of their joint distribution:

    H(X, Y) = − Σ_{x, y} p(x, y) log p(x, y)
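
As an illustration, here is a minimal Python sketch that computes the joint entropy (in bits) directly from a joint probability table; the particular probability values are hypothetical example numbers, not taken from the article:

    import math

    # Hypothetical joint distribution p(x, y) over two binary variables,
    # given as a table of probabilities that sums to 1.
    joint = {
        (0, 0): 0.40,
        (0, 1): 0.10,
        (1, 0): 0.20,
        (1, 1): 0.30,
    }

    # H(X, Y) = -sum over (x, y) of p(x, y) * log2 p(x, y)
    joint_entropy = -sum(p * math.log2(p) for p in joint.values() if p > 0)
    print(f"H(X, Y) = {joint_entropy:.4f} bits")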

If X and Y are independent, then the joint entropy is simply the sum of their individual entropies: H(X, Y) = H(X) + H(Y).
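
As a quick check of this additivity, the following Python sketch builds a joint distribution as the product of two hypothetical marginals (so X and Y are independent by construction) and confirms that H(X, Y) equals H(X) + H(Y):

    import math

    def entropy(dist):
        # Shannon entropy in bits of a distribution given as {outcome: probability}.
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Hypothetical marginal distributions for two independent variables.
    p_x = {0: 0.7, 1: 0.3}
    p_y = {0: 0.5, 1: 0.5}

    # Under independence the joint distribution is the product of the marginals.
    joint = {(x, y): px * py for x, px in p_x.items() for y, py in p_y.items()}

    print(f"H(X) + H(Y) = {entropy(p_x) + entropy(p_y):.4f} bits")
    print(f"H(X, Y)     = {entropy(joint):.4f} bits")  # equal under independence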

(Note: The joint entropy should not be confused with the cross entropy, despite the similar notation.)
