Principle of Maximum Entropy - Testable Information

The principle of maximum entropy is applicable only to testable information. A piece of information is testable if it can be determined whether a given probability distribution is consistent with it. For example, the statements

The expectation of the variable x is 2.87

and

p₂ + p₃ > 0.6

are statements of testable information.

Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.
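As a concrete illustration, consider the first testable statement above: a variable on the support {1, 2, 3, 4} (the support is an assumption chosen for this example) whose expectation is constrained to be 2.87. The Lagrange-multiplier solution to this problem has the Gibbs form p_i ∝ exp(λx_i); the sketch below finds λ numerically by bisection, since the resulting mean is monotone in λ.

```python
import math

def maxent_given_mean(support, target_mean, tol=1e-12):
    """Maximum-entropy distribution on `support` subject to E[x] = target_mean.

    The Lagrange-multiplier solution has the Gibbs form
    p_i = exp(lam * x_i) / Z; we locate lam by bisection, because
    the mean under this family is monotone increasing in lam.
    """
    def mean(lam):
        weights = [math.exp(lam * x) for x in support]
        z = sum(weights)
        return sum(x * w for x, w in zip(support, weights)) / z

    lo, hi = -50.0, 50.0  # bracket wide enough for any reasonable target
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid

    lam = (lo + hi) / 2
    weights = [math.exp(lam * x) for x in support]
    z = sum(weights)
    return [w / z for w in weights]

# The distribution that maximizes entropy given E[x] = 2.87 on {1, 2, 3, 4}
p = maxent_given_mean([1, 2, 3, 4], 2.87)
```

Because the target mean 2.87 lies above the midpoint of the support, the resulting probabilities tilt toward the larger values; a target equal to the midpoint would recover the uniform distribution (λ = 0).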

Entropy maximization with no testable information takes place under a single constraint: the sum of the probabilities must be one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution, p_i = 1/n for each of the n possible outcomes.

The principle of maximum entropy can thus be seen as a generalization of the classical principle of indifference, also known as the principle of insufficient reason.

