Inverse Probability

In probability theory, inverse probability is an obsolete term for the probability distribution of an unobserved variable.

Today, the general problem of determining an unobserved variable (by whatever method) is called inferential statistics, and the method of inverse probability (assigning a probability distribution to an unobserved variable) is called Bayesian probability. The "distribution" of an unobserved variable given the data is properly the likelihood function, which is not a probability distribution, while the distribution of an unobserved variable given both the data and a prior distribution is the posterior distribution. The development of the field and of its terminology from "inverse probability" to "Bayesian probability" is described by Fienberg (2006). The term "Bayesian", which displaced "inverse probability", was in fact introduced by R. A. Fisher as a derogatory term.
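
In modern notation, the relationship sketched above is Bayes' theorem. A minimal statement, with θ standing for the unobserved variable and x for the observed data (symbols introduced here for illustration, not from the original text):

    p(θ | x) = p(x | θ) p(θ) / p(x)

The posterior p(θ | x) is proportional to the likelihood p(x | θ), read as a function of θ for fixed data x, times the prior p(θ). The likelihood on its own need not integrate to one over θ, which is why it is not itself a probability distribution; combining it with a prior is what yields a genuine distribution, the posterior.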

The term "inverse probability" appears in an 1837 paper of De Morgan, in reference to Laplace's method of probability (developed in a 1774 paper, which independently discovered and popularized Bayesian methods, and 1812 book), though the term "inverse probability" does not occur in these.

Inverse probability, variously interpreted, was the dominant approach to statistics until the development of frequentism in the early 20th century by R. A. Fisher, Jerzy Neyman and Egon Pearson. Following the development of frequentism, the terms frequentist and Bayesian developed to contrast these approaches, and became common in the 1950s.
