The principle of maximum entropy is directly applicable only to testable information. A piece of information is testable if it can be determined whether a given distribution is consistent with it. For example, the statements
- The expectation of the variable x is 2.87
- p₂ + p₃ > 0.6

are statements of testable information.
Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.
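As a minimal sketch of this procedure (not from the source): for a mean constraint, the Lagrange-multiplier solution takes the Gibbs form pᵢ ∝ exp(−λxᵢ), and λ can be found numerically. The support {1, …, 6} and the target mean 2.87 (echoing the example above) are assumptions for illustration.

```python
import math

def maxent_given_mean(support, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution on a finite support with E[x] fixed.

    Maximizing entropy subject to sum(p) = 1 and E[x] = target_mean
    yields the Gibbs form p_i = exp(-lam * x_i) / Z.  The mean is
    strictly decreasing in lam, so we solve for lam by bisection.
    """
    def mean(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid  # mean still too high -> increase lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# Hypothetical example: a six-sided-die-like support with mean 2.87.
probs = maxent_given_mean([1, 2, 3, 4, 5, 6], 2.87)
```

Because the mean constraint pulls below the unconstrained mean of 3.5, the recovered λ is positive and the distribution tilts toward smaller outcomes while remaining as "spread out" as the constraint allows.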
Entropy maximization with no testable information takes place under a single constraint: the sum of the probabilities must be one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution, pᵢ = 1/n for each of the n possible outcomes.
The principle of maximum entropy can thus be seen as a generalization of the classical principle of indifference, also known as the principle of insufficient reason.
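The no-constraint case can be checked numerically: the entropy of the uniform distribution on n outcomes is log n, and no other distribution on the same support exceeds it. A small sketch (support size n = 6 is an assumption for illustration):

```python
import math
import random

def entropy(p):
    """Shannon entropy in nats, skipping zero-probability terms."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 6
uniform = [1.0 / n] * n

# The uniform distribution attains the maximum possible entropy, log(n).
random.seed(0)
for _ in range(1000):
    # Random normalized distribution on the same support.
    w = [random.random() + 1e-12 for _ in range(n)]
    total = sum(w)
    p = [wi / total for wi in w]
    assert entropy(p) <= entropy(uniform) + 1e-12
```

Every randomly drawn distribution has entropy at most log(6) ≈ 1.7918, with equality only for the uniform distribution itself — a numerical echo of the principle of indifference.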