Likelihood Principle

In statistics, the likelihood principle is a controversial principle of statistical inference asserting that, given a statistical model, all the evidence in a sample relevant to the model's parameters is contained in the likelihood function.

A likelihood function arises from a probability distribution for the data, viewed as a function of the distribution's parameter with the data held fixed. For example, consider a model that gives the probability density function f(x | θ) of an observable random variable X as a function of a parameter θ. Then for a specific observed value x of X, the function L(θ | x) = f(x | θ) is the likelihood function of θ: it measures how "likely" any particular value of θ is, given that X was observed to equal x. Two likelihood functions are equivalent if one is a positive scalar multiple of the other. The likelihood principle states that all information from the data relevant to inference about the value of θ is contained in this equivalence class. The strong likelihood principle applies the same criterion across experiments, for instance to sequential experiments in which the available sample results from applying a stopping rule to earlier observations.
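The equivalence-class idea can be made concrete with the classic binomial versus negative-binomial illustration (a sketch with assumed numbers, not taken from this article): suppose 9 successes and 3 failures are observed, either in a fixed run of 12 trials, or by sampling until the 3rd failure. The two sampling designs give different probability models, but their likelihood functions for the success probability θ differ only by a constant factor, so under the likelihood principle they carry the same evidence about θ.

```python
from math import comb

def binom_lik(theta, n=12, k=9):
    # Likelihood of theta: k successes in a fixed number n of trials
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

def negbinom_lik(theta, r=3, k=9):
    # Likelihood of theta: sampling continued until the r-th failure,
    # with k successes observed along the way (k + r = 12 trials here)
    return comb(k + r - 1, k) * theta**k * (1 - theta)**r

# The ratio of the two likelihoods is the same for every theta,
# so the two functions lie in the same equivalence class.
ratios = [binom_lik(t) / negbinom_lik(t) for t in (0.1, 0.3, 0.5, 0.9)]
```

Here the ratio is comb(12, 9) / comb(11, 9) = 220 / 55 = 4 regardless of θ, so inferences about θ that respect the likelihood principle must coincide under the two designs, even though classical procedures (e.g. p-values) can differ between them.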

