**Maximum Entropy Derivation**

The exponential family arises naturally as the answer to the following question: what is the maximum-entropy distribution consistent with given constraints on expected values?

The information entropy of a probability distribution *dF*(*x*) can only be computed with respect to some other probability distribution (or, more generally, a positive measure), and both measures must be mutually absolutely continuous. Accordingly, we need to pick a *reference measure* *dH*(*x*) with the same support as *dF*(*x*).

The entropy of *dF*(*x*) relative to *dH*(*x*) is

$$S[dF \mid dH] = -\int \frac{dF}{dH} \log\frac{dF}{dH}\, dH$$

or equivalently

$$S[dF \mid dH] = \int \log\frac{dH}{dF}\, dF,$$
where *dF*/*dH* and *dH*/*dF* are Radon–Nikodym derivatives. Note that the ordinary definition of entropy for a discrete distribution supported on a set *I*, namely

$$S = -\sum_{i \in I} p_i \log p_i,$$

*assumes*, though this is seldom pointed out, that *dH* is chosen to be the counting measure on *I*.
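To make the role of the reference measure concrete, here is a minimal sketch in plain Python (the function and variable names are illustrative, not from the article): it computes the entropy of a discrete distribution relative to a reference measure given by positive weights *h*_{i}, and with *h*_{i} = 1 (the counting measure) it reduces to the ordinary formula above.

```python
import math

def relative_entropy(p, h):
    """Entropy of the distribution p relative to reference weights h:
    S = -sum_i p_i * log(p_i / h_i)."""
    return -sum(pi * math.log(pi / hi) for pi, hi in zip(p, h) if pi > 0)

p = [0.5, 0.25, 0.25]

# With the counting measure (h_i = 1) this is the ordinary Shannon entropy.
s_counting = relative_entropy(p, [1.0, 1.0, 1.0])

# A different reference measure yields a different entropy value.
s_weighted = relative_entropy(p, [2.0, 1.0, 1.0])
```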

Consider now a collection of observable quantities (random variables) *T*_{i}. The probability distribution *dF* whose entropy with respect to *dH* is greatest, subject to the conditions that the expected value of *T*_{i} be equal to *t*_{i}, is a member of the exponential family with *dH* as reference measure and (*T*_{1}, ..., *T*_{n}) as sufficient statistic.
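This claim can be checked numerically. The following sketch (plain Python; names are illustrative) takes a four-point support with the counting measure as reference and a single constraint E[*X*] = *t*. It solves for the natural parameter by bisection, producing the exponential-family (Gibbs) distribution, and compares its entropy with that of another distribution satisfying the same mean constraint.

```python
import math

def gibbs(eta, xs):
    """Exponential-family (Gibbs) distribution p_i proportional to
    exp(eta * x_i), with the counting measure on xs as reference."""
    w = [math.exp(eta * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p, xs):
    return sum(pi * xi for pi, xi in zip(p, xs))

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def solve_eta(xs, t, lo=-50.0, hi=50.0, steps=200):
    """Bisection for eta: the mean of gibbs(eta, xs) increases with eta."""
    for _ in range(steps):
        mid = (lo + hi) / 2.0
        if mean(gibbs(mid, xs), xs) < t:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

xs = [0, 1, 2, 3]
t = 1.2                    # target expected value E[X] = t
p_star = gibbs(solve_eta(xs, t), xs)

# Any other distribution with the same mean has strictly lower entropy.
q = [0.4, 0.2, 0.2, 0.2]   # also has mean 1.2, but is not of Gibbs form
```

Here `entropy(p_star)` exceeds `entropy(q)`, as the theorem predicts, while both match the constraint.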

The derivation is a simple variational calculation using Lagrange multipliers. Normalization is imposed by letting *T*_{0} = 1 be one of the constraints. The natural parameters of the distribution are the Lagrange multipliers, and the normalization factor is the Lagrange multiplier associated with *T*_{0}.
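As a sketch of that calculation, write *f* = *dF*/*dH* and introduce a multiplier $\eta_i$ for each constraint (including $T_0 = 1$ for normalization):

```latex
% Maximize S[dF | dH] subject to \int T_i f \, dH = t_i, with T_0 = 1.
\begin{aligned}
\mathcal{L}[f] &= -\int f \log f \, dH
  + \sum_{i=0}^{n} \eta_i \left( \int T_i f \, dH - t_i \right), \\
\frac{\delta \mathcal{L}}{\delta f} &= -\log f - 1 + \sum_{i=0}^{n} \eta_i T_i
  = 0 \\
\Longrightarrow \quad
f(x) &= \exp\!\Big( \sum_{i=1}^{n} \eta_i T_i(x) \Big)\, e^{\eta_0 - 1}.
\end{aligned}
% The factor e^{\eta_0 - 1} is fixed by the normalization constraint T_0 = 1,
% so dF/dH has exponential-family form with natural parameters eta_1, ..., eta_n.
```

The stationarity condition thus forces the density to be an exponential of a linear combination of the *T*_{i}, which is exactly the exponential-family form with *dH* as reference measure.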

For examples of such derivations, see Maximum entropy probability distribution.
