Priors

In Bayesian statistical inference, a prior probability distribution, often called simply the prior, of an uncertain quantity p (for example, the proportion of voters who will vote for the politician named Smith in a future election) is the probability distribution that expresses one's uncertainty about p before the data (for example, an opinion poll) are taken into account. It attributes uncertainty rather than randomness to the uncertain quantity, which may be a parameter or a latent variable.

One applies Bayes' theorem, multiplying the prior by the likelihood function and then normalizing, to get the posterior probability distribution, which is the conditional distribution of the uncertain quantity given the data.
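
As a concrete illustration, the following minimal Python sketch applies this multiply-and-normalize step on a grid of candidate values of p. The flat prior, the hypothetical poll result of 6 Smith supporters out of 10 respondents, and the grid resolution are assumptions made only for this example.

```python
import numpy as np

# Grid approximation of Bayes' theorem for a proportion p.
p_grid = np.linspace(0, 1, 1001)      # candidate values of p
prior = np.ones_like(p_grid)          # flat prior over [0, 1] (an assumption)
k, n = 6, 10                          # hypothetical poll: 6 of 10 for Smith

# Likelihood of the data at each candidate p (binomial kernel; the constant
# binomial coefficient cancels in the normalization step).
likelihood = p_grid**k * (1 - p_grid)**(n - k)

# Posterior is proportional to prior times likelihood; normalize so it sums to 1.
unnormalized = prior * likelihood
posterior = unnormalized / unnormalized.sum()

print("posterior mean of p ≈", (p_grid * posterior).sum())
```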

A prior is often the purely subjective assessment of an experienced expert. When one is available, a conjugate prior is frequently chosen, because it simplifies calculation of the posterior distribution.

Parameters of prior distributions are called hyperparameters, to distinguish them from parameters of the model of the underlying data. For instance, if one is using a beta distribution to model the distribution of the parameter p of a Bernoulli distribution (as sketched in the example after this list), then:

  • p is a parameter of the underlying system (Bernoulli distribution), and
  • α and β are parameters of the prior distribution (beta distribution), hence hyperparameters.
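
To make these roles concrete, here is a minimal Python sketch of the beta-Bernoulli case. The hyperparameter values, the simulated coin-flip data, and the use of scipy.stats are illustrative assumptions, not part of the text above; the sketch relies on the conjugacy of the beta prior with Bernoulli data, under which the posterior is again a beta distribution with updated hyperparameters.

```python
from scipy import stats

# Hyperparameters of the beta prior on p (illustrative values, an assumption).
alpha_prior, beta_prior = 2.0, 2.0

# Hypothetical Bernoulli observations: 1 = success, 0 = failure.
data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
successes = sum(data)
failures = len(data) - successes

# Conjugacy: a Beta(alpha, beta) prior combined with Bernoulli data yields a
# Beta(alpha + successes, beta + failures) posterior, so updating the
# hyperparameters amounts to simple counting.
alpha_post = alpha_prior + successes
beta_post = beta_prior + failures

posterior = stats.beta(alpha_post, beta_post)
print("posterior mean of p:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```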
