Bayesian Inference

In Bayesian statistics, the conjugate prior of the mean vector is another multivariate normal distribution, and the conjugate prior of the covariance matrix is an inverse-Wishart distribution \mathcal{W}^{-1}. Suppose then that n observations have been made,

\mathbf{X} = \{\mathbf{x}_1,\dots,\mathbf{x}_n\} \sim \mathcal{N}(\boldsymbol\mu,\boldsymbol\Sigma),

and that a conjugate prior has been assigned, where

p(\boldsymbol\mu,\boldsymbol\Sigma) = p(\boldsymbol\mu\mid\boldsymbol\Sigma)\, p(\boldsymbol\Sigma),

where

p(\boldsymbol\mu\mid\boldsymbol\Sigma) \sim \mathcal{N}\left(\boldsymbol\mu_0,\tfrac{1}{m}\boldsymbol\Sigma\right)

and

p(\boldsymbol\Sigma) \sim \mathcal{W}^{-1}(\boldsymbol\Psi, n_0).

Then,


\begin{array}{rcl}
p(\boldsymbol\mu\mid\boldsymbol\Sigma,\mathbf{X}) & \sim & \mathcal{N}\left(\frac{n\bar{\mathbf{x}} + m\boldsymbol\mu_0}{n+m},\frac{1}{n+m}\boldsymbol\Sigma\right),\\
p(\boldsymbol\Sigma\mid\mathbf{X}) & \sim & \mathcal{W}^{-1}\left(\boldsymbol\Psi+n\mathbf{S}+\frac{nm}{n+m}(\bar{\mathbf{x}}-\boldsymbol\mu_0)(\bar{\mathbf{x}}-\boldsymbol\mu_0)', n+n_0\right),
\end{array}

where


\begin{array}{rcl}
\bar{\mathbf{x}} & = & n^{-1}\sum_{i=1}^{n} \mathbf{x}_i ,\\
\mathbf{S} & = & n^{-1}\sum_{i=1}^{n} (\mathbf{x}_i - \bar{\mathbf{x}})(\mathbf{x}_i - \bar{\mathbf{x}})' .
\end{array}
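
For concreteness, the following is a minimal numerical sketch of this conjugate update, not part of the original article. It uses NumPy and SciPy; the names mu0, m, Psi and n0 mirror the prior hyperparameters above, while the dimensions, prior values and synthetic data are purely illustrative assumptions.

# Sketch of the conjugate posterior update for a multivariate normal
# with unknown mean and covariance (normal / inverse-Wishart prior).
import numpy as np
from scipy.stats import invwishart, multivariate_normal

rng = np.random.default_rng(0)

# Illustrative synthetic data: n observations of a d-dimensional normal.
d, n = 3, 200
true_mu = np.array([1.0, -2.0, 0.5])
true_Sigma = np.diag([1.0, 2.0, 0.5])
X = rng.multivariate_normal(true_mu, true_Sigma, size=n)

# Assumed prior hyperparameters:
#   mu | Sigma ~ N(mu0, Sigma / m),   Sigma ~ W^{-1}(Psi, n0).
mu0 = np.zeros(d)
m = 1.0
Psi = np.eye(d)
n0 = d + 2  # prior degrees of freedom; must exceed d - 1

# Sufficient statistics, matching the definitions of x-bar and S above.
xbar = X.mean(axis=0)
S = (X - xbar).T @ (X - xbar) / n

# Posterior hyperparameters, matching the two formulas above.
mu_post = (n * xbar + m * mu0) / (n + m)
kappa_post = n + m
Psi_post = Psi + n * S + (n * m / (n + m)) * np.outer(xbar - mu0, xbar - mu0)
nu_post = n + n0

# One draw from the joint posterior: Sigma first, then mu given Sigma.
Sigma_draw = invwishart.rvs(df=nu_post, scale=Psi_post, random_state=rng)
mu_draw = multivariate_normal.rvs(mean=mu_post, cov=Sigma_draw / kappa_post,
                                  random_state=rng)

print("posterior mean of mu:", mu_post)
print("one posterior draw of mu:", mu_draw)

Sampling Sigma from the inverse-Wishart posterior and then mu conditional on that draw follows the factorization implied by the two displayed posteriors, p(mu, Sigma | X) = p(mu | Sigma, X) p(Sigma | X).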
