Consistent Estimator - Definition

Definition

Loosely speaking, an estimator Tn of a parameter θ is said to be consistent if it converges in probability to the true value of the parameter:

 \underset{n\to\infty}{\operatorname{plim}}\;T_n = \theta.
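
Here plim denotes convergence in probability; spelled out, this requires that for every ε > 0 the probability that Tn deviates from θ by more than ε vanishes as the sample size grows:

 \lim_{n\to\infty}\Pr\bigl(|T_n - \theta| > \varepsilon\bigr) = 0 \quad \text{for every}\ \varepsilon > 0.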

A more rigorous definition takes into account the fact that θ is actually unknown, and thus the convergence in probability must take place for every possible value of this parameter. Suppose {pθ : θ ∈ Θ} is a family of distributions (the parametric model), and Xθ = {X1, X2, … : Xi ~ pθ} is an infinite sample from the distribution pθ. Let { Tn(Xθ) } be a sequence of estimators for some parameter g(θ). Usually Tn will be based on the first n observations of a sample. Then this sequence {Tn} is said to be (weakly) consistent if

 \underset{n\to\infty}{\operatorname{plim}}\;T_n(X^{\theta}) = g(\theta),\ \ \text{for all}\ \theta\in\Theta.

This definition uses g(θ) instead of simply θ because often one is interested in estimating a certain function or a sub-vector of the underlying parameter. For example, one may wish to estimate the location parameter of a model but not its scale.
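
As an illustrative sketch (not from the original text), the following Python snippet checks weak consistency empirically for the sample mean, which consistently estimates the location parameter θ of a Normal(θ, σ²) model whatever the unknown scale σ is: the Monte Carlo estimate of P(|Tn − θ| > ε) shrinks toward zero as n grows. The particular values of θ, σ, ε and the sample sizes are arbitrary choices made for the demonstration.

    # Illustrative sketch: empirical check that the sample mean T_n is a
    # consistent estimator of the location parameter theta (all numeric values
    # below are arbitrary demonstration choices, not taken from the article).
    import numpy as np

    rng = np.random.default_rng(0)
    theta, sigma, eps = 2.0, 3.0, 0.1   # true location, scale, tolerance
    n_reps = 2000                       # Monte Carlo replications per sample size

    for n in (10, 100, 1000, 10000):
        draws = rng.normal(theta, sigma, size=(n_reps, n))   # n observations per replication
        t_n = draws.mean(axis=1)                             # T_n on each replication
        p_far = np.mean(np.abs(t_n - theta) > eps)           # estimate of P(|T_n - theta| > eps)
        print(f"n = {n:6d}   P(|T_n - theta| > {eps}) ~ {p_far:.3f}")

For the sample mean this behaviour also follows directly from the weak law of large numbers.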
