Categorical Distribution - With A Conjugate Prior - Posterior Conditional Distribution

In Gibbs sampling, we typically need to draw from the conditional distribution of each variable in a multi-variable Bayes network given all the other variables. In networks that include categorical variables with Dirichlet priors (e.g. mixture models and models with mixture components), the Dirichlet distributions are often "collapsed out" (marginalized out) of the network. This introduces dependencies among the categorical nodes that share a given prior: their joint distribution is a Dirichlet-multinomial distribution. One reason for collapsing is that the conditional distribution of one categorical node given the others then takes a particularly simple form: it is exactly the posterior predictive distribution of that node given the remaining nodes.
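For reference, the collapsed joint distribution mentioned above has a standard closed form; writing $c_i$ (a notation introduced here, paralleling the count notation below) for the number of the $N$ nodes assigned category $i$, it is

\begin{align}
p(\mathbb{X}\mid\boldsymbol{\alpha}) &=\, \frac{\Gamma\!\left(\sum_i \alpha_i\right)}{\Gamma\!\left(N + \sum_i \alpha_i\right)} \prod_i \frac{\Gamma\!\left(c_i + \alpha_i\right)}{\Gamma\!\left(\alpha_i\right)}
\end{align}

Dividing this joint by the same expression with node $n$ removed yields the conditional given next.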

That is, for a set of nodes $\mathbb{X}=\{x_1,\dots,x_N\}$, if we denote the node in question as $x_n$ and the remainder as $\mathbb{X}^{(-n)}$, then


\begin{align}
p(x_n=i\mid\mathbb{X}^{(-n)},\boldsymbol{\alpha}) &=\, \frac{c_i^{(-n)} + \alpha_i}{N-1+\sum_i \alpha_i} \\
&\propto\, c_i^{(-n)} + \alpha_i
\end{align}

where $c_i^{(-n)}$ is the number of nodes having category $i$ among the nodes other than node $n$.
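Below is a minimal NumPy sketch of one collapsed Gibbs update implementing this conditional; the function name, example data, and symmetric prior are illustrative assumptions, not taken from the original text.

```python
import numpy as np

def collapsed_gibbs_step(x, n, K, alpha, rng):
    """Resample the category of node n given all other nodes,
    with the Dirichlet prior marginalized out.

    x     : integer array of current category assignments (length N)
    n     : index of the node being resampled
    K     : number of categories
    alpha : Dirichlet concentration parameters (length K)
    rng   : numpy.random.Generator
    """
    # Counts c_i^(-n): occurrences of each category, excluding node n.
    counts = np.bincount(np.delete(x, n), minlength=K)

    # Unnormalized conditional: c_i^(-n) + alpha_i.
    weights = counts + alpha

    # Normalizing divides by N - 1 + sum_i alpha_i, as in the formula above.
    probs = weights / weights.sum()
    x[n] = rng.choice(K, p=probs)
    return x

# Example sweep over all nodes (hypothetical data):
rng = np.random.default_rng(0)
K, N = 3, 10
alpha = np.ones(K)                    # symmetric prior, chosen for illustration
x = rng.integers(0, K, size=N)
for n in range(N):
    collapsed_gibbs_step(x, n, K, alpha, rng)
```

Note that because the prior is collapsed, each update needs only the category counts over the other nodes, so no Dirichlet parameter vector is ever sampled explicitly.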
