Max Ent - Testable Information

The principle of maximum entropy is explicitly useful only when applied to testable information. A piece of information is testable if it can be determined whether a given distribution is consistent with it. For example, the statements

The expectation of the variable x is 2.87

and

p2 + p3 > 0.6

are statements of testable information.
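Whether a given distribution satisfies such statements can be checked directly. The sketch below (in Python; the four-point support {1, 2, 3, 4}, the candidate distribution, and the helper name is_consistent are illustrative assumptions, not part of the original examples) tests a candidate distribution against both constraints:

    import numpy as np

    def is_consistent(p, x):
        # True if p is a valid distribution over support x that satisfies
        # both example statements of testable information.
        p, x = np.asarray(p, float), np.asarray(x, float)
        is_distribution = np.all(p >= 0) and np.isclose(p.sum(), 1.0)
        mean_ok = np.isclose(p @ x, 2.87)   # E[x] = 2.87
        tail_ok = p[1] + p[2] > 0.6         # p2 + p3 > 0.6 (p[1] is p2)
        return bool(is_distribution and mean_ok and tail_ok)

    # A candidate distribution on the assumed support {1, 2, 3, 4}:
    print(is_consistent([0.05, 0.25, 0.48, 0.22], [1, 2, 3, 4]))  # True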

Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.
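For a single expectation constraint on a finite support, the Lagrange conditions imply a distribution of exponential form, p_i proportional to exp(λ·x_i), with the multiplier λ chosen so the constraint holds. The following is a minimal sketch in Python, assuming the hypothetical support {1, 2, 3, 4} and the target mean 2.87 from the example above:

    import numpy as np
    from scipy.optimize import brentq

    x = np.array([1.0, 2.0, 3.0, 4.0])  # assumed finite support
    target_mean = 2.87                  # the expectation constraint

    def moment_gap(lam):
        # Mean of the exponential-form distribution minus the target;
        # the Lagrange multiplier lam is the root of this function.
        w = np.exp(lam * x)
        return (w / w.sum()) @ x - target_mean

    lam = brentq(moment_gap, -10.0, 10.0)  # solve for the multiplier
    p = np.exp(lam * x)
    p /= p.sum()
    print(p, p @ x)  # maximum entropy distribution; its mean is ~2.87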

Entropy maximization with no testable information takes place under a single constraint: the sum of the probabilities must be one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution, p_i = 1/n for each of the n possible outcomes.
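As a quick numerical check (a sketch assuming a hypothetical four-outcome sample space), maximizing the entropy under the normalization constraint alone does recover the uniform distribution:

    import numpy as np
    from scipy.optimize import minimize

    n = 4

    def neg_entropy(p):
        # Negative Shannon entropy; minimizing it maximizes entropy.
        return np.sum(p * np.log(p))

    result = minimize(
        neg_entropy,
        x0=np.array([0.7, 0.1, 0.1, 0.1]),  # deliberately non-uniform start
        constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
        bounds=[(1e-9, 1.0)] * n,
    )
    print(np.round(result.x, 4))  # ~[0.25 0.25 0.25 0.25]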

The principle of maximum entropy can thus be seen as a generalization of the classical principle of indifference, also known as the principle of insufficient reason.
