Entropy (Information Theory) - Relative Entropy

Another useful measure of entropy that works equally well in the discrete and the continuous case is the relative entropy of a distribution. It is defined as the Kullback-Leibler divergence from the distribution to a reference measure m, as follows. Assume that a probability distribution p is absolutely continuous with respect to a measure m, i.e. it is of the form p(dx) = f(x)m(dx) for some non-negative m-integrable function f with m-integral 1. Then the relative entropy can be defined as

D_KL(p ‖ m) = ∫ f(x) log f(x) m(dx) = ∫ log(dp/dm) dp.
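
The discrete case of this definition is straightforward to compute directly. The following is a minimal sketch, assuming Python with NumPy and natural logarithms (nats); the function name relative_entropy and the example distributions are illustrative, not taken from the text above.

```python
import numpy as np

def relative_entropy(p, m):
    """D(p || m) over a finite space, in nats.

    This specializes the integral above: f = dp/dm = p/m, and integrating
    f*log(f) against m gives sum over x of p(x) * log(p(x) / m(x)).
    Points with p(x) == 0 contribute 0 by the usual 0*log 0 = 0 convention.
    """
    p = np.asarray(p, dtype=float)
    m = np.asarray(m, dtype=float)
    support = p > 0  # p must vanish wherever m does (absolute continuity)
    return float(np.sum(p[support] * np.log(p[support] / m[support])))

# A biased distribution measured against the uniform reference distribution.
p = [0.5, 0.25, 0.25]
m = [1/3, 1/3, 1/3]
print(relative_entropy(p, m))  # about 0.0589 nats, non-negative
print(relative_entropy(m, m))  # 0.0: the divergence vanishes when p = m
```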
In this integral form the relative entropy generalises (up to a change in sign) both the discrete entropy, where the measure m is the counting measure, and the differential entropy, where the measure m is the Lebesgue measure. If the measure m is itself a probability distribution, the relative entropy is non-negative, and zero if and only if p = m as measures. It is defined for any measure space, hence it is coordinate-independent and invariant under coordinate reparameterizations, provided one properly takes into account the transformation of the measure m. The relative entropy, and implicitly the entropy and differential entropy, do depend on the "reference" measure m.
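
To make the sign relationship concrete, the sketch below (again assuming Python with NumPy, natural logarithms, and strictly positive probabilities; the variable names are illustrative) checks that the counting measure as reference reproduces the discrete entropy up to a sign change, and that a probability reference measure yields a non-negative divergence.

```python
import numpy as np

p = np.array([0.5, 0.25, 0.25])

# Counting measure as reference: every point has mass 1, so f = p and
# D(p || counting) = sum p*log(p), the negative of the discrete entropy.
d_counting = np.sum(p * np.log(p / np.ones_like(p)))
shannon = -np.sum(p * np.log(p))
print(np.isclose(d_counting, -shannon))          # True: equal up to sign

# Probability distribution as reference: the divergence is non-negative
# and vanishes exactly when the two distributions coincide.
m = np.array([1/3, 1/3, 1/3])
print(np.sum(p * np.log(p / m)) >= 0)            # True
print(np.isclose(np.sum(m * np.log(m / m)), 0))  # True: p = m gives 0
```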
