Quantities of Information - Entropy

The entropy of a discrete message space is a measure of the amount of uncertainty one has about which message will be chosen. It is defined as the average self-information of a message from that message space:

$$H(M) = \mathbb{E}[I(M)] = \sum_{m \in M} p(m)\, I(m) = -\sum_{m \in M} p(m) \log p(m)$$

where $\mathbb{E}$ denotes the expected value operation.
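As a concrete illustration (a minimal sketch added here, not part of the original article), the entropy of a finite message space can be computed directly from its probabilities; the function names and the four-message distribution below are hypothetical examples:

```python
import math

def self_information(p):
    """Self-information I(m) = -log2 p(m), in bits."""
    return -math.log2(p)

def entropy(probs):
    """Entropy H(M) = E[I(M)] = -sum of p(m) * log2 p(m), in bits."""
    return sum(p * self_information(p) for p in probs if p > 0)

# Hypothetical message space with four messages and their probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))  # 1.75 bits
```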

An important property of entropy is that it is maximized when all the messages in the message space are equiprobable (e.g. $p(m) = 1/|M|$). In this case $H(M) = \log |M|$.
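As a quick numerical check of this property (a sketch added here, not from the original text), the snippet below compares a uniform distribution over eight messages with a hypothetical skewed one on the same support:

```python
import math

def entropy_bits(probs):
    """Entropy in bits: -sum of p * log2 p over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 8
uniform = [1 / n] * n                                   # equiprobable messages
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02]  # same support, unequal probabilities

print(entropy_bits(uniform), math.log2(n))  # 3.0 3.0 -- the uniform case attains log2 |M|
print(entropy_bits(skewed))                 # about 2.21 -- strictly less than 3.0
```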

Sometimes the function H is expressed in terms of the probabilities of the distribution:

$$H(p_1, p_2, \ldots, p_k) = -\sum_{i=1}^{k} p_i \log p_i,$$

where each $p_i \ge 0$ and $\sum_{i=1}^{k} p_i = 1$.

An important special case of this is the binary entropy function:

$$H_\mathrm{b}(p) = H(p, 1-p) = -p \log p - (1-p) \log (1-p).$$
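For illustration only (a small sketch, not from the original article), the binary entropy function can be evaluated directly; the function name below is a hypothetical choice:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p log2 p - (1 - p) log2 (1 - p), in bits; H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 bit: a fair coin flip is maximally uncertain
print(binary_entropy(0.1))  # ~0.469 bits: a heavily biased coin is far more predictable
```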
