Likelihood Principle

In statistics, the likelihood principle is a controversial principle of statistical inference which asserts that all of the evidence in a sample relevant to a model's parameters is contained in the likelihood function.

A likelihood function arises from a conditional probability distribution considered as a function of its parameter argument, with the data argument held fixed. For example, consider a model which gives the probability density function of an observable random variable X as a function of a parameter θ. Then for a specific observed value x of X, the function L(θ | x) = f(x | θ), the density (or, for a discrete variable, the probability mass) at x viewed as a function of θ, is a likelihood function of θ: it gives a measure of how "likely" any particular value of θ is, given that X has the value x. Two likelihood functions are equivalent if one is a positive scalar multiple of the other, where the scalar does not depend on θ. The likelihood principle states that all information from the data relevant to inferences about the value of θ is found in this equivalence class. The strong likelihood principle applies the same criterion to cases such as sequential experiments, where the available sample of data results from applying a stopping rule to the observations made earlier in the experiment.
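The classic illustration of this equivalence involves a stopping rule. The sketch below (a hypothetical example constructed for illustration, not taken from the text above) compares two experiments that both yield 3 successes in 12 Bernoulli trials: one fixes the number of trials at 12 in advance (binomial model), the other samples until the 3rd success (negative binomial model). Their likelihood functions differ only by a constant factor, so under the likelihood principle they support identical inferences about θ.

```python
from math import comb

def binom_lik(theta, n=12, x=3):
    # P(X = x | theta) when the number of trials n is fixed in advance
    return comb(n, x) * theta**x * (1 - theta)**(n - x)

def negbinom_lik(theta, n=12, x=3):
    # P(N = n | theta) when sampling stops at the x-th success
    return comb(n - 1, x - 1) * theta**x * (1 - theta)**(n - x)

# The theta-dependent factor theta^3 * (1-theta)^9 is identical in both,
# so the ratio is a constant in theta: comb(12,3) / comb(11,2) = 220 / 55.
for t in (0.1, 0.25, 0.5, 0.9):
    print(round(binom_lik(t) / negbinom_lik(t), 6))
```

Because the two likelihoods lie in the same equivalence class, any inference method that obeys the likelihood principle (for example, maximum likelihood estimation or Bayesian updating with a fixed prior) gives the same answer for both experiments, whereas methods that depend on the sampling plan, such as frequentist p-values, can differ.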
