Entropy (Information Theory) - Relative Entropy

Another useful measure of entropy, which works equally well in the discrete and the continuous case, is the relative entropy of a distribution. It is defined as the Kullback-Leibler divergence from the distribution to a reference measure m, as follows. Assume that a probability distribution p is absolutely continuous with respect to a measure m, i.e. it is of the form p(dx) = f(x) m(dx) for some non-negative m-integrable function f with m-integral 1. Then the relative entropy can be defined as
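
    D_KL(p || m) = ∫ log(f(x)) p(dx) = ∫ f(x) log(f(x)) m(dx).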

In this form the relative entropy generalises (up to a change in sign) both the discrete entropy, where the measure m is the counting measure, and the differential entropy, where the measure m is the Lebesgue measure. If the measure m is itself a probability distribution, the relative entropy is non-negative, and zero if and only if p = m as measures. It is defined for any measure space, and is therefore coordinate independent and invariant under coordinate reparametrizations, provided one properly takes into account the transformation of the measure m. The relative entropy, and implicitly the entropy and the differential entropy, do depend on the "reference" measure m.
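
As a concrete check of these specializations, the following minimal numerical sketch (not part of the original article; the distribution p and the reference measures are chosen purely for illustration) evaluates D_KL(p || m) for a small discrete distribution: first against the counting measure, where it equals the negative of the discrete entropy, and then against a uniform probability distribution, where it is non-negative.

import numpy as np

# An illustrative probability distribution p on three points.
p = np.array([0.5, 0.25, 0.25])

# Reference measure m = counting measure (mass 1 on each point), so f(x) = p(x).
# D_KL(p || counting measure) = sum_x p(x) log p(x) = -H(p).
d_counting = np.sum(p * np.log(p))
discrete_entropy = -np.sum(p * np.log(p))
print(d_counting, -discrete_entropy)   # both ≈ -1.0397 nats

# Reference measure m = uniform probability distribution on the same points.
# Since m is itself a probability distribution, D_KL(p || m) >= 0,
# with equality exactly when p = m.
m = np.full(3, 1.0 / 3.0)
d_uniform = np.sum(p * np.log(p / m))
print(d_uniform)                       # ≈ 0.0589 >= 0

An analogous calculation with m taken to be Lebesgue measure on the real line would reproduce the negative of the differential entropy.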
