Principle of Maximum Entropy

In Bayesian probability theory, the principle of maximum entropy is a postulate. It states that, subject to precisely stated prior data (which must be a proposition expressing testable information), the probability distribution that best represents the current state of knowledge is the one with the largest information-theoretical entropy.
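A sketch of the formal statement, for a discrete distribution p = (p_1, ..., p_n): the quantity being maximized is the Shannon entropy, and the testable information enters as constraints. Here the constraint functions f_k and values F_k are generic placeholders, not part of the original text:

\[
H(p) = -\sum_{i=1}^{n} p_i \log p_i,
\qquad
p^{*} = \arg\max_{p} H(p)
\quad \text{subject to} \quad
\sum_{i} p_i = 1, \;\;
\sum_{i} p_i \, f_k(x_i) = F_k .
\]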

Let some precisely stated prior data or testable information about a probability distribution be given, and consider the set of all trial probability distributions that encode that prior data. Of those, the one that maximizes the information entropy is the distribution that best represents the current state of knowledge, as in the sketch below.
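A minimal numerical sketch of this procedure, assuming NumPy and SciPy are available: the "testable information" is taken, purely for illustration, to be a six-sided die whose mean is constrained to 4.5, and the entropy is maximized over all distributions satisfying that constraint.

```python
import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)      # outcomes 1..6 of the die
target_mean = 4.5            # assumed example of "testable information"

def neg_entropy(p):
    # Negative Shannon entropy; minimizing it maximizes the entropy.
    return np.sum(p * np.log(p))

constraints = [
    # Trial distributions must be normalized...
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
    # ...and must reproduce the prescribed mean.
    {"type": "eq", "fun": lambda p: np.dot(p, faces) - target_mean},
]

# Start from the uniform distribution and maximize entropy subject
# to the constraints that encode the prior data.
p0 = np.full(6, 1.0 / 6.0)
result = minimize(neg_entropy, p0, method="SLSQP",
                  bounds=[(1e-9, 1.0)] * 6, constraints=constraints)

print("maximum-entropy distribution:", np.round(result.x, 4))
print("entropy:", -neg_entropy(result.x))
```

The returned distribution tilts probability toward the higher faces just enough to meet the mean constraint, while staying as close to uniform (maximally noncommittal) as that constraint allows.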
