Variational Message Passing - Likelihood Lower Bound

Given some set of hidden variables $H$ and observed variables $V$, the goal of approximate inference is to lower-bound the probability $P(V)$ that a graphical model is in the configuration $V$. Over some probability distribution $Q(H)$ (to be defined later),

$\ln P(V) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)} + \sum_H Q(H) \ln \frac{Q(H)}{P(H \mid V)}$.

So, if we define our lower bound to be

$\mathcal{L}(Q) = \sum_H Q(H) \ln \frac{P(H,V)}{Q(H)}$,

then the likelihood is simply this bound plus the relative entropy between $Q(H)$ and $P(H \mid V)$. Because the relative entropy is non-negative, $\mathcal{L}(Q)$ is indeed a lower bound on the log likelihood of our observation $V$. The distribution $Q(H)$ will have a simpler character than that of $P$, because marginalizing over $P$ is intractable for all but the simplest graphical models. In particular, VMP uses a factorized distribution:

$Q(H) = \prod_i Q_i(H_i)$,

where $H_i$ is a disjoint part of the graphical model.
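As a concrete check of this decomposition, the following Python sketch builds a factorized $Q$ over two binary hidden variables and verifies numerically that $\ln P(V) = \mathcal{L}(Q) + \mathrm{KL}(Q \,\|\, P(H \mid V))$, so that $\mathcal{L}(Q) \le \ln P(V)$. The joint probabilities are invented purely for illustration.

```python
import math
from itertools import product

# Toy model: two binary hidden variables H = (H1, H2), observation V fixed.
# Hypothetical joint probabilities P(H = h, V) for the observed V.
joint = {(0, 0): 0.10, (0, 1): 0.05, (1, 0): 0.20, (1, 1): 0.15}
p_v = sum(joint.values())                            # P(V) by exact marginalization
posterior = {h: p / p_v for h, p in joint.items()}   # P(H = h | V)

# Factorized approximation Q(H) = Q1(H1) * Q2(H2) over the disjoint parts.
q1 = {0: 0.3, 1: 0.7}
q2 = {0: 0.6, 1: 0.4}
q = {(h1, h2): q1[h1] * q2[h2] for h1, h2 in product(q1, q2)}

# Lower bound: L(Q) = sum_H Q(H) ln[ P(H, V) / Q(H) ]
lower_bound = sum(q[h] * math.log(joint[h] / q[h]) for h in q)

# Relative entropy: KL(Q || P(H | V)) = sum_H Q(H) ln[ Q(H) / P(H | V) ]
kl = sum(q[h] * math.log(q[h] / posterior[h]) for h in q)

# The decomposition holds exactly, and KL >= 0 makes L(Q) a lower bound.
assert abs(math.log(p_v) - (lower_bound + kl)) < 1e-12
assert lower_bound <= math.log(p_v)
```

Note that the identity holds for any valid $Q(H)$; choosing a better $Q$ only shifts weight from the KL term into the bound, which is what variational optimization exploits.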
