Bayesian Network - Statistical Introduction

Given data \(x\) and a parameter \(\theta\), a simple Bayesian analysis starts with a prior probability (prior) \(p(\theta)\) and likelihood \(p(x \mid \theta)\) to compute a posterior probability \(p(\theta \mid x) \propto p(x \mid \theta)\,p(\theta)\).
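As an illustrative sketch (not part of the original text), this proportionality can be evaluated numerically on a grid. The coin-flip data (7 heads in 10 flips) and the uniform Beta(1, 1) prior below are assumptions chosen only for the example.

```python
# Illustrative sketch: grid approximation of p(theta | x) ∝ p(x | theta) p(theta).
# The data (7 heads in 10 coin flips) and the uniform Beta(1, 1) prior are
# assumptions made for this example only.
import numpy as np
from scipy.stats import beta, binom

theta = np.linspace(0.001, 0.999, 999)      # grid of parameter values
prior = beta.pdf(theta, 1, 1)               # p(theta): uniform prior on [0, 1]
likelihood = binom.pmf(7, 10, theta)        # p(x | theta): 7 heads in 10 flips

unnormalized = likelihood * prior           # p(x | theta) p(theta)
posterior = unnormalized / np.trapz(unnormalized, theta)   # normalize on the grid

print(theta[np.argmax(posterior)])          # posterior mode, roughly 0.7
```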

Often the prior on \(\theta\) depends in turn on other parameters \(\varphi\) that are not mentioned in the likelihood. So, the prior \(p(\theta)\) must be replaced by a likelihood \(p(\theta \mid \varphi)\), and a prior \(p(\varphi)\) on the newly introduced parameters \(\varphi\) is required, resulting in a posterior probability

\[
  p(\theta, \varphi \mid x) \propto p(x \mid \theta)\, p(\theta \mid \varphi)\, p(\varphi).
\]

This is the simplest example of a hierarchical Bayes model.
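A minimal sketch of this two-level case, under an assumed toy Gaussian hierarchy (the specific distributions and the single observation are illustrative choices, not taken from the text above):

```python
# Illustrative sketch: the two-level posterior
#   p(theta, phi | x) ∝ p(x | theta) p(theta | phi) p(phi)
# evaluated on a 2-D grid for an assumed toy hierarchy
#   x ~ Normal(theta, 1),  theta ~ Normal(phi, 1),  phi ~ Normal(0, 1),
# with a single assumed observation x = 2.0.
import numpy as np
from scipy.stats import norm

x = 2.0
theta, phi = np.meshgrid(np.linspace(-4.0, 6.0, 401), np.linspace(-4.0, 6.0, 401))

joint = (norm.pdf(x, loc=theta, scale=1.0)      # p(x | theta)
         * norm.pdf(theta, loc=phi, scale=1.0)  # p(theta | phi)
         * norm.pdf(phi, loc=0.0, scale=1.0))   # p(phi)
joint /= joint.sum()                            # normalize over the grid

# Summing theta out of the grid gives the marginal posterior of phi.
phi_axis = phi[:, 0]
phi_marginal = joint.sum(axis=1)
print(phi_axis[np.argmax(phi_marginal)])        # posterior mode of phi, about 2/3
```

Read as a Bayesian network, the three factors correspond to the nodes \(\varphi\), \(\theta\) and \(x\), connected by the edges \(\varphi \rightarrow \theta \rightarrow x\).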

The process may be repeated; for example, the parameters \(\varphi\) may depend in turn on additional parameters \(\psi\), which will require their own prior. Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.
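Written out explicitly for a third level, with a fixed prior \(p(\psi)\) terminating the hierarchy, the joint posterior factors as

\[
  p(\theta, \varphi, \psi \mid x) \propto p(x \mid \theta)\, p(\theta \mid \varphi)\, p(\varphi \mid \psi)\, p(\psi),
\]

a direct extension of the two-level factorization above.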
