Variational Message Passing - Likelihood Lower Bound


Given some set of hidden variables H and observed variables V, the goal of approximate inference is to lower-bound the probability \ln P(V) that a graphical model is in the configuration V. Over some probability distribution Q(H) (to be defined later),

\ln P(V) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)} + \sum_H Q(H) \ln \frac{Q(H)}{P(H \mid V)}.

So, if we define our lower bound to be

L(Q) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)},
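As a concrete check, this bound can be evaluated numerically for a small discrete model. The sketch below uses a made-up two-state hidden variable and an arbitrary Q(H) (all numbers are hypothetical) to compute L(Q) and compare it with the exact log likelihood:

```python
import math

# Toy model: hidden H in {0, 1}, a single observed configuration V.
# Hypothetical joint probabilities P(H = h, V) for the observed V:
P_joint = {0: 0.12, 1: 0.28}
P_V = sum(P_joint.values())  # exact marginal likelihood P(V)

# An arbitrary approximating distribution Q(H):
Q = {0: 0.5, 1: 0.5}

# L(Q) = sum_H Q(H) * ln( P(H, V) / Q(H) )
L = sum(Q[h] * math.log(P_joint[h] / Q[h]) for h in Q)

print(L)              # the lower bound L(Q)
print(math.log(P_V))  # ln P(V); L(Q) never exceeds this
```

Tightening Q toward the true posterior P(H | V) = P(H, V) / P(V) would close the gap, since the gap is exactly the relative entropy discussed next.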

then the log likelihood is simply this bound plus the relative entropy between Q(H) and P(H \mid V):

\ln P(V) = L(Q) + \mathrm{KL}\big(Q(H) \,\|\, P(H \mid V)\big).

Because the relative entropy is non-negative, the function L(Q) defined above is indeed a lower bound of the log likelihood of our observation V. The distribution Q(H) will have a simpler character than that of P(H \mid V), because marginalizing over H is intractable for all but the simplest of graphical models. In particular, VMP uses a factorized distribution Q(H):

Q(H) = \prod_i Q_i(H_i),

where H_i is a disjoint part of the graphical model.
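The decomposition of the log likelihood into the bound plus a relative-entropy term can be verified numerically for a factorized Q. The sketch below (all probability tables are hypothetical) uses two binary hidden variables with a mean-field approximation Q(H) = Q1(H1) Q2(H2):

```python
import math

# Hypothetical joint P(H1, H2, V) for one observed V, over binary H1, H2.
P_joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.05, (1, 1): 0.25}
P_V = sum(P_joint.values())                        # marginal likelihood P(V)
P_post = {h: p / P_V for h, p in P_joint.items()}  # posterior P(H | V)

# Factorized (mean-field) approximation Q(H) = Q1(H1) * Q2(H2):
Q1 = {0: 0.4, 1: 0.6}
Q2 = {0: 0.3, 1: 0.7}
Q = {(a, b): Q1[a] * Q2[b] for a in Q1 for b in Q2}

# Lower bound L(Q) and relative entropy KL(Q || P(H | V)):
L = sum(q * math.log(P_joint[h] / q) for h, q in Q.items())
KL = sum(q * math.log(q / P_post[h]) for h, q in Q.items())

# Identity: ln P(V) = L(Q) + KL, with KL >= 0.
print(L + KL, math.log(P_V))
```

Because the factorized Q cannot in general match the true posterior exactly, KL stays strictly positive and L(Q) remains a strict lower bound; VMP then optimizes each factor Q_i in turn to tighten it.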
