
Extending Discrete Entropy To The Continuous Case: Differential Entropy

The Shannon entropy is restricted to random variables taking discrete values. The corresponding formula for a continuous random variable with probability density function f(x) on the real line is defined by analogy, using the above form of the entropy as an expectation:

h[f] = \operatorname{E}[-\log f(x)] = -\int_{-\infty}^{\infty} f(x) \log f(x) \, dx.

This formula is usually referred to as the continuous entropy, or differential entropy. A precursor of the continuous entropy is the functional appearing in Boltzmann's H-theorem.

Although the analogy between the two functions is suggestive, the following question must be asked: is the differential entropy a valid extension of the Shannon discrete entropy? Differential entropy lacks a number of properties that the Shannon discrete entropy has – it can even be negative – and corrections have been suggested, notably the limiting density of discrete points.

To answer this question, we must establish a connection between the two functions:

We wish to obtain a generally finite measure as the bin size goes to zero. In the discrete case, the bin size is the (implicit) width of each of the n (finite or infinite) bins whose probabilities are denoted by p_n. As we generalize to the continuous domain, we must make this width explicit.

To do this, start with a continuous function f discretized into bins of size Δ. By the mean-value theorem there exists a value x_i in each bin such that

f(x_i) \Delta = \int_{i\Delta}^{(i+1)\Delta} f(x) \, dx,

and thus the integral of the function f can be approximated (in the Riemannian sense) by

\int_{-\infty}^{\infty} f(x) \, dx = \lim_{\Delta \to 0} \sum_{i=-\infty}^{\infty} f(x_i) \Delta,

where this limit and "bin size goes to zero" are equivalent.
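For instance, here is a minimal numerical sketch of this convergence, assuming a standard normal density for f (an arbitrary choice; its integral is known to be 1):

import numpy as np

# Riemann sum of a standard normal density with bins of width delta:
# sum_i f(x_i) * delta should approach the integral of f, which is 1.
for delta in [1.0, 0.1, 0.01]:
    x = np.arange(-10.0, 10.0, delta)           # one sample point x_i per bin
    f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # density values f(x_i)
    print(delta, np.sum(f * delta))             # -> 1 as delta -> 0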

We will denote

H^{\Delta} := -\sum_{i=-\infty}^{\infty} f(x_i) \Delta \log \left( f(x_i) \Delta \right),

and expanding the logarithm, we have


\begin{align}
H^{\Delta} &= - \sum_{i=-\infty}^{\infty} f(x_i) \Delta \log \left( f(x_i) \Delta \right) \\
&= - \sum_{i=-\infty}^{\infty} f(x_i) \Delta \log f(x_i) - \sum_{i=-\infty}^{\infty} f(x_i) \Delta \log \Delta.
\end{align}
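For a concrete worked case, take f uniform on [0, 1), so that f(x_i) = 1 in each of the 1/Δ bins; the expansion above then reduces to

\begin{align}
H^{\Delta} &= - \sum_{i} \Delta \cdot 1 \cdot \log 1 - \sum_{i} \Delta \log \Delta \\
&= 0 - \frac{1}{\Delta} \, \Delta \log \Delta = -\log \Delta,
\end{align}

which diverges to +∞ as the bin size shrinks; the divergence is carried entirely by the log Δ term.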

As Δ → 0, we have

\sum_{i=-\infty}^{\infty} f(x_i) \Delta \to \int_{-\infty}^{\infty} f(x) \, dx = 1

and also

\sum_{i=-\infty}^{\infty} f(x_i) \Delta \log f(x_i) \to \int_{-\infty}^{\infty} f(x) \log f(x) \, dx.

But note that log Δ → −∞ as Δ → 0; therefore we need a special definition of the differential or continuous entropy:

h[f] = \lim_{\Delta \to 0} \left( H^{\Delta} + \log \Delta \right) = -\int_{-\infty}^{\infty} f(x) \log f(x) \, dx,

which is, as said before, referred to as the differential entropy. This means that the differential entropy is not a limit of the Shannon entropy for Δ → 0. Rather, it differs from the limit of the Shannon entropy by an infinite offset.
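A minimal numerical sketch of this offset, again assuming a standard normal density (whose differential entropy has the closed form ½ log(2πe) in nats):

import numpy as np

# For a standard normal, h[f] = 0.5 * log(2 * pi * e) in nats.
h_true = 0.5 * np.log(2 * np.pi * np.e)

def H_delta(delta, support=12.0):
    """Discrete entropy of the binned density, H^Delta = -sum_i p_i log p_i."""
    x = np.arange(-support, support, delta)      # one point x_i per bin
    f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # f(x_i)
    p = f * delta                                # bin probabilities p_i = f(x_i) * delta
    return -np.sum(p * np.log(p))

for delta in [0.5, 0.1, 0.01]:
    # H^Delta itself blows up like -log(delta); H^Delta + log(delta) -> h[f].
    print(delta, H_delta(delta) + np.log(delta), h_true)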

As a result, unlike the Shannon entropy, the differential entropy is not in general a good measure of uncertainty or information. For example, the differential entropy can be negative; also, it is not invariant under continuous coordinate transformations.
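Both defects can be exhibited with elementary examples. For the uniform density f(x) = 2 on [0, 1/2],

h[f] = -\int_0^{1/2} 2 \log 2 \, dx = -\log 2 < 0,

and for a rescaled variable Y = aX with a > 0, the density transforms as f_Y(y) = f_X(y/a)/a, so that

h(Y) = h(X) + \log a,

meaning a mere change of units shifts the differential entropy.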
