Mean Squared Error - Examples

Examples

Suppose we have a random sample of size n from a population, X_1, \dots, X_n. The usual estimator for the mean is the sample average

\overline{X}=\frac{1}{n}\sum_{i=1}^n X_i

which has an expected value of μ (so it is unbiased) and a mean squared error of

\operatorname{MSE}\left(\overline{X}\right)=\operatorname{E}\left(\left(\overline{X}-\mu\right)^2\right)=\left(\frac{\sigma}{\sqrt{n}}\right)^2=\frac{\sigma^2}{n}

where σ² is the population variance.

For a Gaussian distribution this is the best unbiased estimator (that is, it has the lowest MSE among all unbiased estimators), but not, say, for a uniform distribution.
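A quick Monte Carlo sketch makes this concrete (Python standard library only; the values of μ, σ, n, and the trial count are arbitrary choices for illustration): the empirical MSE of the sample average should approach σ²/n.

```python
import random

# A Monte Carlo sketch: for i.i.d. draws from N(mu, sigma^2), the empirical
# MSE of the sample average should be close to sigma^2 / n.
random.seed(0)
mu, sigma, n, trials = 5.0, 2.0, 10, 200_000

total = 0.0
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(sample) / n
    total += (xbar - mu) ** 2

mse = total / trials
print(mse, sigma**2 / n)  # empirical vs. theoretical MSE (sigma^2/n = 0.4)
```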

The usual estimator for the variance is

S^2_{n-1} = \frac{1}{n-1}\sum_{i=1}^n\left(X_i-\overline{X}\,\right)^2
=\frac{1}{n-1}\left(\sum_{i=1}^n X_i^2-n\overline{X}^2\right).

This is unbiased (its expected value is σ²), and its MSE is

\begin{align}\operatorname{MSE}(S^2_{n-1})&= \frac{1}{n} \left(\mu_4-\frac{n-3}{n-1}\sigma^4\right) \\
&=\frac{1}{n} \left(\gamma_2+\frac{2n}{n-1}\right)\sigma^4,\end{align}

where \mu_4=\operatorname{E}\left(\left(X-\mu\right)^4\right) is the fourth central moment of the distribution or population, and \gamma_2=\mu_4/\sigma^4-3 is the excess kurtosis.
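This formula can be checked numerically in the Gaussian case, where γ₂ = 0 and the MSE reduces to 2σ⁴/(n − 1) (a sketch; the parameter values are arbitrary choices):

```python
import random
import statistics

# Sketch: empirical check of MSE(S^2_{n-1}) = (1/n)(gamma_2 + 2n/(n-1)) * sigma^4.
# For Gaussian data gamma_2 = 0, so the formula reduces to 2 * sigma^4 / (n - 1).
random.seed(1)
sigma, n, trials = 1.0, 8, 200_000

total = 0.0
for _ in range(trials):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    s2 = statistics.variance(sample)  # sample variance: divides by n - 1
    total += (s2 - sigma**2) ** 2

mse = total / trials
theory = 2 * sigma**4 / (n - 1)
print(mse, theory)  # the two values should agree to Monte Carlo accuracy
```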

However, one can use other estimators for σ² which are proportional to S^2_{n-1}, and an appropriate choice can always give a lower mean squared error. If we define

\begin{align}S^2_a &= \frac{n-1}{a}S^2_{n-1}\\
&= \frac{1}{a}\sum_{i=1}^n\left(X_i-\overline{X}\,\right)^2\end{align}

then the MSE is

\begin{align}
\operatorname{MSE}(S^2_a)&=\operatorname{E}\left(\left(\frac{n-1}{a} S^2_{n-1}-\sigma^2\right)^2 \right) \\
&=\frac{n-1}{n a^2}\left((n-1)\gamma_2+n^2+n\right)\sigma^4-\frac{2(n-1)}{a}\sigma^4+\sigma^4
\end{align}

This is minimized when

a=\frac{(n-1)\gamma_2+n^2+n}{n}=n+1+\frac{n-1}{n}\gamma_2.

For a Gaussian distribution, where γ₂ = 0, this means the MSE is minimized when dividing the sum by a = n + 1, whereas for a Bernoulli distribution with p = 1/2 (a coin flip), γ₂ = −2, and the MSE is minimized for a = n − 1 + 2/n. (Note that this particular case of the Bernoulli distribution has the lowest possible excess kurtosis; this can be proved by Jensen's inequality as follows. The fourth central moment is an upper bound for the square of the variance, so the least value for their ratio is one; therefore, the least value for the excess kurtosis is −2, achieved, for instance, by a Bernoulli with p = 1/2.) So no matter what the kurtosis, we get a "better" estimate (in the sense of having a lower MSE) by scaling down the unbiased estimator a little bit. Even among unbiased estimators, if the distribution is not Gaussian, the best (minimum mean squared error) estimator of the variance may not be S^2_{n-1}.
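The Bernoulli case can be sketched by simulation (fair coin, so γ₂ = −2; the sample size and trial count are arbitrary choices): the divisor a = n − 1 + 2/n should beat both n − 1 and n + 1.

```python
import random

# Sketch of the Bernoulli(1/2) case (gamma_2 = -2): the scaled estimator
# sum((x - xbar)^2) / a should have its minimum MSE at a = n - 1 + 2/n.
random.seed(2)
n, trials = 10, 200_000
sigma2 = 0.25                  # variance of a fair coin flip
a_star = n - 1 + 2 / n         # predicted optimal divisor (9.2 for n = 10)

mse = {n - 1: 0.0, a_star: 0.0, n + 1: 0.0}
for _ in range(trials):
    xs = [random.randint(0, 1) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    for a in mse:
        mse[a] += (ss / a - sigma2) ** 2 / trials

print(mse)  # the entry for a_star should be the smallest
```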

The following table gives several estimators of the true parameters of the population, μ and σ², for the Gaussian case.

True value | Estimator | Mean squared error
θ = μ | \overline{X} = the unbiased estimator of the population mean | \operatorname{MSE}(\overline{X})=\operatorname{E}\left(\left(\overline{X}-\mu\right)^2\right)=\frac{\sigma^2}{n}
θ = σ² | S^2_{n-1} = the unbiased estimator of the population variance | \operatorname{MSE}(S^2_{n-1})=\operatorname{E}\left(\left(S^2_{n-1}-\sigma^2\right)^2\right)=\frac{2}{n-1}\sigma^4
θ = σ² | S^2_n = the biased estimator of the population variance | \operatorname{MSE}(S^2_n)=\operatorname{E}\left(\left(S^2_n-\sigma^2\right)^2\right)=\frac{2n-1}{n^2}\sigma^4
θ = σ² | S^2_{n+1} = the biased estimator of the population variance | \operatorname{MSE}(S^2_{n+1})=\operatorname{E}\left(\left(S^2_{n+1}-\sigma^2\right)^2\right)=\frac{2}{n+1}\sigma^4

Note that:

  1. The MSEs shown for the variance estimators assume X_i \sim \mathcal{N}(\mu,\sigma^2) i.i.d., so that \gamma_2 = 0 and \mu_4 = 3\sigma^4. The result for S^2_{n-1} follows easily from the \chi^2_{n-1} variance, which is 2n-2.
  2. Unbiased estimators may not produce estimates with the smallest total variation (as measured by MSE): the MSE of S^2_{n-1} is larger than that of S^2_{n+1} or S^2_n.
  3. Estimators with the smallest total variation may produce biased estimates: S^2_{n+1} typically underestimates σ² by \frac{2}{n+1}\sigma^2.
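The three variance rows of the table can be checked by simulation (a sketch; σ, n, and the trial count are arbitrary choices): each empirical MSE should match its closed form.

```python
import random

# Sketch: Monte Carlo check of the table's closed-form MSEs in the Gaussian
# case: 2*sigma^4/(n-1), (2n-1)*sigma^4/n^2, and 2*sigma^4/(n+1).
random.seed(3)
sigma, n, trials = 1.0, 5, 200_000

err = {n - 1: 0.0, n: 0.0, n + 1: 0.0}
for _ in range(trials):
    xs = [random.gauss(0.0, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    for a in err:
        err[a] += (ss / a - sigma**2) ** 2 / trials

theory = {
    n - 1: 2 * sigma**4 / (n - 1),
    n: (2 * n - 1) * sigma**4 / n**2,
    n + 1: 2 * sigma**4 / (n + 1),
}
print(err)     # empirical MSEs keyed by the divisor a
print(theory)  # closed forms from the table
```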
