The principle of maximum entropy is explicitly useful only when applied to testable information. A piece of information is testable if it can be determined whether a given distribution is consistent with it. For example, the statements
- The expectation of the variable x is 2.87
and
- p₂ + p₃ > 0.6
are statements of testable information.
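To make "testable" concrete, here is a minimal sketch: for each statement above, a function decides whether a candidate distribution satisfies it. The helper names and the dice-style support {1, …, 6} are illustrative assumptions, not part of the original text.

```python
def satisfies_mean_constraint(p, support, target=2.87, tol=1e-9):
    """Testable statement 1: is E[x] equal to the target (up to tolerance)?"""
    mean = sum(pi * xi for pi, xi in zip(p, support))
    return abs(mean - target) <= tol

def satisfies_tail_constraint(p, threshold=0.6):
    """Testable statement 2: is p₂ + p₃ greater than the threshold?
    (1-based indices 2 and 3, i.e. list positions 1 and 2.)"""
    return p[1] + p[2] > threshold

# Check a uniform distribution over {1, ..., 6} against both statements:
p = [1 / 6] * 6
support = [1, 2, 3, 4, 5, 6]
print(satisfies_mean_constraint(p, support))  # False: E[x] = 3.5, not 2.87
print(satisfies_tail_constraint(p))           # False: 1/6 + 1/6 ≈ 0.33
```

Because each check returns a definite yes/no for any given distribution, both statements qualify as testable information.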
Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.
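As a sketch of the procedure: for a single mean constraint, the method of Lagrange multipliers yields a distribution of the form pᵢ ∝ exp(−λxᵢ), where λ is chosen so the constraint holds. The code below finds λ by bisection (exploiting that the mean is monotone decreasing in λ); the support {1, …, 6} and the function name are assumptions for illustration, using the expectation value 2.87 from the example above.

```python
import math

def maxent_dist(support, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `support` subject to E[x] = target_mean.
    Lagrange multipliers give p_i proportional to exp(-lam * x_i);
    lam is located by bisection, since the mean decreases monotonically in lam."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)  # normalizing constant (partition function)
        return sum(x * wi for x, wi in zip(support, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:
            lo = mid   # mean too large: increase lam
        else:
            hi = mid   # mean too small: decrease lam
    lam = (lo + hi) / 2
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_dist([1, 2, 3, 4, 5, 6], 2.87)
print([round(pi, 4) for pi in p])
print(sum(x * pi for x, pi in zip([1, 2, 3, 4, 5, 6], p)))  # ≈ 2.87
```

The resulting probabilities decay exponentially across the support, which is characteristic of maximum-entropy solutions under a mean constraint.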
Entropy maximization with no testable information takes place under a single constraint: the sum of the probabilities must be one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution, pᵢ = 1/n for each of the n possible outcomes.
The principle of maximum entropy can thus be seen as a generalization of the classical principle of indifference, also known as the principle of insufficient reason.
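A quick numerical check of this claim, assuming a four-outcome example (the specific distributions are illustrative): the uniform distribution attains the maximum Shannon entropy log₂(n), and any non-uniform alternative falls below it.

```python
import math

def entropy(p):
    """Shannon entropy in bits, with the convention 0 * log(0) = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [1 / 4] * 4          # principle of indifference: n = 4 outcomes
skewed = [0.7, 0.1, 0.1, 0.1]  # any other distribution over the same outcomes

print(entropy(uniform))  # log2(4) = 2.0 bits, the maximum
print(entropy(skewed))   # strictly less than 2.0 bits
```

This mirrors the principle of indifference: absent any testable information beyond normalization, no outcome should be favored, and that is exactly the distribution entropy maximization selects.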