Differential Entropy

Differential entropy (also called continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of the average surprisal of a random variable, to continuous probability distributions. For a continuous random variable X with probability density function f, it is defined as h(X) = −∫ f(x) log f(x) dx, where the integral is taken over the support of f.
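As a concrete illustration of the definition h(X) = −E[log f(X)], the sketch below estimates the differential entropy of a standard normal distribution by Monte Carlo sampling and compares it against the known closed form ½ ln(2πeσ²). The function names and sample size are illustrative choices, not part of any standard API.

```python
import math
import random

def gaussian_logpdf(x, mu=0.0, sigma=1.0):
    # Log density of the normal distribution N(mu, sigma^2).
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def mc_differential_entropy(sigma=1.0, n=200_000, seed=0):
    # Monte Carlo estimate of h(X) = -E[log f(X)],
    # obtained by sampling X ~ N(0, sigma^2) and averaging -log f(X).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        total += gaussian_logpdf(x, 0.0, sigma)
    return -total / n

# Closed-form differential entropy of N(0, 1): 0.5 * ln(2*pi*e) nats.
closed_form = 0.5 * math.log(2 * math.pi * math.e)
estimate = mc_differential_entropy()
```

With 200,000 samples the Monte Carlo estimate agrees with the closed form (about 1.419 nats) to roughly two decimal places; note that, unlike discrete entropy, this value can be negative for densities concentrated on a small interval (e.g. a narrow Gaussian with small σ).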

