Variational Message Passing - Likelihood Lower Bound


Given some set of hidden variables H and observed variables V, the goal of approximate inference is to lower-bound the probability P(V) that a graphical model is in the configuration V. Over some probability distribution Q(H) (to be defined later),

\ln P(V) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)} + \sum_H Q(H) \ln \frac{Q(H)}{P(H \mid V)}.
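This decomposition can be checked numerically. The following is a minimal sketch using a hypothetical two-state discrete model (the table `joint` and the choice of `q` are illustrative, not from the original text): for any normalized Q(H), the lower bound plus the relative entropy recovers ln P(V) exactly.

```python
import math

# Hypothetical two-state example: hidden H in {0, 1}, observation V fixed.
# joint[h] = P(H = h, V) for the observed configuration.
joint = [0.12, 0.28]
p_v = sum(joint)                       # marginal likelihood P(V)
posterior = [p / p_v for p in joint]   # exact posterior P(H | V)

# An arbitrary normalized approximating distribution Q(H).
q = [0.7, 0.3]

# Lower bound: L(Q) = sum_H Q(H) ln( P(H, V) / Q(H) )
lower_bound = sum(qh * math.log(ph / qh) for qh, ph in zip(q, joint))

# Relative entropy: KL(Q || P(H|V)) = sum_H Q(H) ln( Q(H) / P(H|V) )
kl = sum(qh * math.log(qh / ph) for qh, ph in zip(q, posterior))

# ln P(V) = L(Q) + KL, and KL >= 0, so L(Q) <= ln P(V).
print(math.log(p_v), lower_bound + kl, kl >= 0.0)
```

Because the relative entropy term is non-negative for any choice of `q`, tightening the bound (maximizing `lower_bound`) is equivalent to driving Q(H) toward the exact posterior.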

So, if we define our lower bound to be

\mathcal{L}(Q) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)},

then the likelihood is simply this bound plus the relative entropy between Q(H) and P(H \mid V). Because the relative entropy is non-negative, the function \mathcal{L}(Q) defined above is indeed a lower bound of the log likelihood \ln P(V) of our observation V. The distribution Q will have a simpler character than that of P, because marginalizing over H in P is intractable for all but the simplest of graphical models. In particular, VMP uses a factorized distribution Q:

Q(H) = \prod_i Q_i(H_i),

where H_i is a disjoint part of the graphical model.
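With such a factorized Q, the lower bound remains tractable to evaluate even though the exact posterior may not be. A minimal sketch, assuming a hypothetical model with two binary hidden variables H1, H2 and a fixed observation V (the table `joint` and the factors `q1`, `q2` are illustrative):

```python
import itertools
import math

# Hypothetical joint P(H1, H2, V) over hidden configurations, observation fixed.
joint = {(0, 0): 0.10, (0, 1): 0.05, (1, 0): 0.30, (1, 1): 0.15}

# Factorized approximation: Q(H) = Q1(H1) * Q2(H2), each factor normalized.
q1 = [0.6, 0.4]
q2 = [0.7, 0.3]

def lower_bound(q1, q2):
    # L(Q) = sum_H Q(H) ln( P(H, V) / Q(H) ), with Q(H) = Q1(H1) Q2(H2)
    total = 0.0
    for h1, h2 in itertools.product(range(2), range(2)):
        q = q1[h1] * q2[h2]
        total += q * math.log(joint[(h1, h2)] / q)
    return total

log_evidence = math.log(sum(joint.values()))  # ln P(V)
print(lower_bound(q1, q2) <= log_evidence)    # True for any normalized factors
```

The factorization restricts the family of distributions Q can range over, so the bound is generally not tight, but it is what makes the coordinate-wise updates of VMP possible.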
