Statistical Introduction
Given data x and a parameter θ, a simple Bayesian analysis starts with a prior probability (prior) p(θ) and a likelihood p(x | θ) to compute a posterior probability p(θ | x) ∝ p(x | θ) p(θ).
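As an illustration of this single-level case, the sketch below computes a posterior over a grid of parameter values for a coin-flip model. The flat prior, the particular data, and the grid resolution are all assumptions chosen for the example, not part of the original text:

```python
import numpy as np

# Hypothetical example: infer a coin's bias theta from observed flips.
# Prior p(theta): flat on (0, 1); likelihood p(x | theta): Bernoulli.
theta = np.linspace(0.001, 0.999, 999)   # grid over the parameter
prior = np.ones_like(theta)              # flat prior, unnormalized

flips = [1, 1, 0, 1, 1, 1, 0, 1]         # assumed data: 6 heads, 2 tails
heads = sum(flips)
tails = len(flips) - heads
likelihood = theta**heads * (1 - theta)**tails

# Bayes' rule: p(theta | x) is proportional to p(x | theta) p(theta)
posterior = likelihood * prior
posterior /= posterior.sum()             # normalize over the grid

print(theta[np.argmax(posterior)])       # posterior mode: 0.75 for this data
```

With a flat prior the posterior mode coincides with the maximum-likelihood estimate, heads/(heads + tails) = 6/8 = 0.75.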
Often the prior on θ depends in turn on other parameters φ that are not mentioned in the likelihood. So the prior p(θ) must be replaced by a likelihood p(θ | φ), and a prior p(φ) on the newly introduced parameters φ is required, resulting in a posterior probability p(θ, φ | x) ∝ p(x | θ) p(θ | φ) p(φ).
This is the simplest example of a hierarchical Bayes model.
The process may be repeated; for example, the parameters φ may depend in turn on additional parameters ψ, which require their own prior p(ψ). Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.
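The two-level posterior p(θ, φ | x) ∝ p(x | θ) p(θ | φ) p(φ) can be sketched the same way, on a joint grid. Here φ is taken to be the mean of a Beta prior on θ with a fixed concentration κ, with a flat hyperprior p(φ); the data, κ, and grid sizes are all assumptions for illustration:

```python
import numpy as np
from math import lgamma

theta = np.linspace(0.01, 0.99, 99)   # grid for the bottom-level parameter
phi   = np.linspace(0.01, 0.99, 99)   # grid for the hyperparameter
kappa = 10.0                          # assumed fixed prior concentration

heads, tails = 6, 2                   # assumed data
log_lik = heads * np.log(theta) + tails * np.log(1 - theta)  # log p(x | theta)

# Build log p(x | theta) + log p(theta | phi) + log p(phi) on the joint grid;
# the flat hyperprior contributes a constant, so it is omitted.
log_joint = np.empty((len(theta), len(phi)))
for j, p in enumerate(phi):
    a, b = p * kappa, (1 - p) * kappa          # Beta(theta | phi) shape params
    log_prior_theta = ((a - 1) * np.log(theta) + (b - 1) * np.log(1 - theta)
                       + lgamma(a + b) - lgamma(a) - lgamma(b))
    log_joint[:, j] = log_lik + log_prior_theta

joint = np.exp(log_joint - log_joint.max())    # exponentiate stably
joint /= joint.sum()                           # normalize over the grid

# Marginal posterior for the hyperparameter: sum the joint over theta
phi_post = joint.sum(axis=0)
print(phi[np.argmax(phi_post)])
```

Because the data favor heads, the marginal posterior for φ concentrates above 0.5, while the marginal for θ is a compromise between the data and the φ-level prior; that shrinkage of lower-level estimates toward a shared hyperparameter is the characteristic behavior of hierarchical models.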