Bayesian Network - Statistical Introduction

Given data x and a parameter θ, a simple Bayesian analysis starts with a prior probability (prior) p(θ) and a likelihood p(x | θ) to compute a posterior probability p(θ | x) ∝ p(x | θ) p(θ).
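As a minimal sketch of this computation (assuming, purely for illustration, a coin-flip example with a Beta(2, 2) prior and a binomial likelihood, none of which appear in the text above), the posterior can be evaluated pointwise on a grid in Python:

    import numpy as np
    from scipy import stats

    theta = np.linspace(0.001, 0.999, 999)             # grid of parameter values
    prior = stats.beta.pdf(theta, 2, 2)                 # prior p(theta), assumed Beta(2, 2)

    heads, flips = 7, 10                                # hypothetical data x
    likelihood = stats.binom.pmf(heads, flips, theta)   # likelihood p(x | theta)

    unnormalized = likelihood * prior                   # p(x | theta) p(theta)
    spacing = theta[1] - theta[0]
    posterior = unnormalized / np.sum(unnormalized * spacing)  # p(theta | x) on the grid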

Often the prior on θ depends in turn on other parameters φ that are not mentioned in the likelihood. So, the prior p(θ) must be replaced by a likelihood p(θ | φ), and a prior p(φ) on the newly introduced parameters φ is required, resulting in a posterior probability

p(θ, φ | x) ∝ p(x | θ) p(θ | φ) p(φ).

This is the simplest example of a hierarchical Bayes model.
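The same proportionality can be sketched in code, assuming for illustration a Normal likelihood with known variance, a Normal prior on θ whose mean is the hyperparameter φ, and a wide Normal hyperprior on φ (these modelling choices are not taken from the text):

    import numpy as np
    from scipy import stats

    x = np.array([1.2, 0.8, 1.5, 0.9])                 # hypothetical observations

    def log_unnormalized_posterior(theta, phi):
        log_lik = stats.norm.logpdf(x, loc=theta, scale=1.0).sum()    # p(x | theta)
        log_prior = stats.norm.logpdf(theta, loc=phi, scale=1.0)      # p(theta | phi)
        log_hyperprior = stats.norm.logpdf(phi, loc=0.0, scale=10.0)  # p(phi)
        return log_lik + log_prior + log_hyperprior

    print(log_unnormalized_posterior(theta=1.0, phi=0.5))

Only the unnormalized log posterior is evaluated here; the proportionality form is typically all that sampling-based inference needs.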

The process may be repeated; for example, the parameters φ may depend in turn on additional parameters ψ, which will require their own prior. Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.
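As an illustrative extension of the factorization above, with additional parameters ψ and prior p(ψ), the posterior becomes p(θ, φ, ψ | x) ∝ p(x | θ) p(θ | φ) p(φ | ψ) p(ψ); the hierarchy terminates once the top-level prior, here p(ψ), depends on no further unmentioned parameters.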
