
Posterior Distribution of the Parameters

Assume that x is distributed according to a normal distribution with unknown mean μ and precision τ,

and that the prior distribution on μ and τ has a normal-gamma distribution


(\mu,\tau) \sim \text{NormalGamma}(\mu_0,\lambda_0,\alpha_0,\beta_0) ,

for which the density π satisfies


\pi(\mu,\tau) \propto \tau^{\alpha_0-\frac{1}{2}}\,\exp\left(-\beta_0\tau\right)\,\exp\left(-\frac{\lambda_0\tau(\mu-\mu_0)^2}{2}\right).
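This density factorizes as a gamma density in τ (shape α_0, rate β_0) times a conditional normal density in μ given τ (mean μ_0, precision λ_0 τ). As an illustration only, a minimal Python sketch that evaluates the joint density this way (the helper name normal_gamma_pdf and the numerical values are illustrative assumptions, not part of the article):


import numpy as np
from scipy import stats

def normal_gamma_pdf(mu, tau, mu0, lambda0, alpha0, beta0):
    # p(mu, tau) = Normal(mu | mu0, 1/(lambda0*tau)) * Gamma(tau | alpha0, rate=beta0)
    p_tau = stats.gamma.pdf(tau, a=alpha0, scale=1.0 / beta0)
    p_mu_given_tau = stats.norm.pdf(mu, loc=mu0, scale=1.0 / np.sqrt(lambda0 * tau))
    return p_mu_given_tau * p_tau

# Evaluation at an arbitrary point, with arbitrary hyperparameters (illustrative only).
print(normal_gamma_pdf(mu=0.3, tau=1.2, mu0=0.0, lambda0=2.0, alpha0=3.0, beta0=1.5))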

Given a dataset \mathbf{X}, consisting of n independent and identically distributed (i.i.d.) random variables x_1, \ldots, x_n, the posterior distribution of μ and τ given this dataset can be analytically determined by Bayes' theorem. Explicitly,

\mathbf{P}(\tau, \mu | \mathbf{X}) \propto \mathbf{L}(\mathbf{X} | \tau, \mu)\,\pi(\tau, \mu) ,

where \mathbf{L}(\mathbf{X} | \tau, \mu) is the likelihood of the data given the parameters.

Since the data are i.i.d., the likelihood of the entire dataset is equal to the product of the likelihoods of the individual data samples:


\mathbf{L}(\mathbf{X} | \tau, \mu) = \prod_{i=1}^n \mathbf{L}(x_i | \tau, \mu) .

This expression can be simplified as follows:


\begin{align}
\mathbf{L}(\mathbf{X} | \tau, \mu) & \propto \prod_{i=1}^n \tau^{1/2} \exp\left(\frac{-\tau}{2}(x_i-\mu)^2\right) \\
& \propto \tau^{n/2} \exp\left(\frac{-\tau}{2}\sum_{i=1}^n(x_i-\mu)^2\right) \\
& \propto \tau^{n/2} \exp\left(\frac{-\tau}{2}\sum_{i=1}^n\left(x_i-\bar{x} + \bar{x} -\mu\right)^2\right) \\
& \propto \tau^{n/2} \exp\left(\frac{-\tau}{2}\sum_{i=1}^n\left((x_i-\bar{x})^2 + (\bar{x} -\mu)^2\right)\right) \\
& \propto \tau^{n/2} \exp\left(\frac{-\tau}{2}\left(n s + n(\bar{x} -\mu)^2\right)\right) ,
\end{align}

where \bar{x} = \frac{1}{n}\sum_{i=1}^n x_i is the mean of the data samples and s = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2 is the sample variance, so that n s = \sum_{i=1}^n (x_i - \bar{x})^2.
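The simplification above rests on the identity \sum_{i=1}^n (x_i - \mu)^2 = n s + n(\bar{x} - \mu)^2 (the cross terms cancel). As a quick numerical sanity check in Python (the synthetic data and the candidate value of μ are chosen only for illustration):


import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=50)  # synthetic data
mu = 0.7                                     # arbitrary candidate mean

n = x.size
xbar = x.mean()
s = ((x - xbar) ** 2).mean()                 # sample variance with 1/n, as defined above

lhs = ((x - mu) ** 2).sum()
rhs = n * s + n * (xbar - mu) ** 2
print(np.isclose(lhs, rhs))                  # True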


The posterior distribution of the parameters is proportional to the prior times the likelihood.


\begin{align}
\mathbf{P}(\tau, \mu | \mathbf{X}) &\propto \mathbf{L}(\mathbf{X} | \tau,\mu) \pi(\tau,\mu) \\
&\propto \tau^{n/2} \exp\left(\frac{-\tau}{2}\left(n s + n(\bar{x} -\mu)^2\right)\right) \tau^{\alpha_0-\frac{1}{2}}\,\exp\left(-\beta_0\tau\right)\,\exp\left(-\frac{\lambda_0\tau(\mu-\mu_0)^2}{2}\right) \\
&\propto \tau^{\frac{n}{2} + \alpha_0 - \frac{1}{2}}\exp\left(-\tau\left(\frac{1}{2} n s + \beta_0\right)\right) \exp\left(-\frac{\tau}{2}\left(\lambda_0(\mu-\mu_0)^2 + n(\bar{x} -\mu)^2\right)\right) .
\end{align}

The final exponential term is simplified by completing the square.


\begin{align}
\lambda_0(\mu-\mu_0)^2 + n(\bar{x} -\mu)^2&=\lambda_0 \mu^2 - 2 \lambda_0 \mu \mu_0 + \lambda_0 \mu_0^2 + n \mu^2 - 2 n \bar{x} \mu + n \bar{x}^2 \\
&= (\lambda_0 + n) \mu^2 - 2(\lambda_0 \mu_0 + n \bar{x}) \mu + \lambda_0 \mu_0^2 +n \bar{x}^2 \\
&= (\lambda_0 + n)( \mu^2 - 2 \frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n} \mu ) + \lambda_0 \mu_0^2 +n \bar{x}^2 \\
&= (\lambda_0 + n)\left(\mu - \frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n} \right) ^2 + \lambda_0 \mu_0^2 +n \bar{x}^2 - \frac{\left( \lambda_0 \mu_0 +n \bar{x} \right)^2}{\lambda_0 + n} \\
&= (\lambda_0 + n)\left(\mu - \frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n} \right) ^2 + \frac{\lambda_0 n (\bar{x} - \mu_0 )^2}{\lambda_0 +n}
\end{align}
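This identity can also be checked symbolically, for example with the following small sympy sketch (sympy is assumed to be available; the symbol names simply mirror the formula), which should print 0:


import sympy as sp

mu, mu0, xbar, lam0, n = sp.symbols('mu mu_0 xbar lambda_0 n', positive=True)

lhs = lam0 * (mu - mu0) ** 2 + n * (xbar - mu) ** 2
rhs = (lam0 + n) * (mu - (lam0 * mu0 + n * xbar) / (lam0 + n)) ** 2 \
    + lam0 * n * (xbar - mu0) ** 2 / (lam0 + n)

print(sp.simplify(lhs - rhs))  # prints 0, confirming the completed square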

On inserting this back into the expression above,


\begin{align}
\mathbf{P}(\tau, \mu | \mathbf{X}) & \propto \tau^{\frac{n}{2} + \alpha_0 - \frac{1}{2}} \exp\left(-\tau \left(\frac{1}{2} n s + \beta_0\right)\right) \exp\left(-\frac{\tau}{2}\left(\left(\lambda_0 + n\right)\left(\mu - \frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n}\right)^2 + \frac{\lambda_0 n (\bar{x} - \mu_0 )^2}{\lambda_0 + n}\right)\right) \\
& \propto \tau^{\frac{n}{2} + \alpha_0 - \frac{1}{2}} \exp\left(-\tau \left(\frac{1}{2} n s + \beta_0 + \frac{\lambda_0 n (\bar{x} - \mu_0 )^2}{2(\lambda_0 + n)}\right)\right) \exp\left(-\frac{\left(\lambda_0 + n\right)\tau}{2}\left(\mu - \frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n}\right)^2\right)
\end{align}

This final expression is in exactly the same form as a Normal-Gamma distribution, i.e.,


\mathbf{P}(\tau, \mu | \mathbf{X}) = \text{NormalGamma}\left(\frac{\lambda_0 \mu_0 + n \bar{x}}{\lambda_0 + n}, \lambda_0 + n, \alpha_0+\frac{n}{2}, \beta_0+ \frac{1}{2}\left(n s + \frac{\lambda_0 n (\bar{x} - \mu_0 )^2}{\lambda_0 +n} \right) \right)
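In practice the update therefore reduces to computing n, \bar{x}, and s from the data and substituting them into the four expressions above. A minimal Python sketch of this update (the function name posterior_normal_gamma, the synthetic data, and the prior hyperparameter values are illustrative assumptions, not part of the article):


import numpy as np

def posterior_normal_gamma(x, mu0, lambda0, alpha0, beta0):
    """Return the posterior NormalGamma hyperparameters given i.i.d. data x."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()
    s = ((x - xbar) ** 2).mean()  # sample variance with 1/n, as in the derivation
    mu_n = (lambda0 * mu0 + n * xbar) / (lambda0 + n)
    lambda_n = lambda0 + n
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * (n * s + lambda0 * n * (xbar - mu0) ** 2 / (lambda0 + n))
    return mu_n, lambda_n, alpha_n, beta_n

# Example with synthetic data and arbitrary prior hyperparameters (illustrative only).
rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=0.5, size=100)
print(posterior_normal_gamma(data, mu0=0.0, lambda0=1.0, alpha0=2.0, beta0=2.0))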

