Estimation Theory - Basics

Basics

To build a model, several statistical "ingredients" need to be known. These ensure that the estimator has some mathematical tractability rather than being based on "good feel".

The first is a set of statistical samples taken from a random vector (RV) of size N. Put into a vector,

$$\mathbf{x} = \begin{bmatrix} x[0] & x[1] & \cdots & x[N-1] \end{bmatrix}^T.$$

Secondly, there are the corresponding M parameters

$$\boldsymbol{\theta} = \begin{bmatrix} \theta_1 & \theta_2 & \cdots & \theta_M \end{bmatrix}^T,$$

which need to be established together with the continuous probability density function (pdf), or its discrete counterpart the probability mass function (pmf), of the data expressed in terms of those parameters:

$$p(\mathbf{x}; \boldsymbol{\theta}).$$

It is also possible for the parameters themselves to have a probability distribution (e.g., Bayesian statistics). It is then necessary to define the Bayesian probability

$$\pi(\boldsymbol{\theta}).$$
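For concreteness, the following is a minimal Python sketch of these ingredients, assuming, purely for illustration, a model in which N samples of an unknown constant A are observed in white Gaussian noise of known variance, with a Gaussian prior on A; the variable names and numerical values below are arbitrary assumptions rather than anything prescribed above.

```python
import numpy as np

# Illustrative (assumed) model: x[n] = A + w[n], where A is the unknown
# parameter and w[n] is white Gaussian noise with known variance sigma2.
rng = np.random.default_rng(0)

N = 100          # size of the sample vector x
A_true = 1.5     # actual parameter value (unknown to the estimator)
sigma2 = 0.25    # known noise variance

# The statistical samples, collected into a vector x of size N.
x = A_true + rng.normal(0.0, np.sqrt(sigma2), size=N)

# The pdf of the data as a function of the parameter, p(x; A),
# evaluated here as a log-likelihood for numerical stability.
def log_likelihood(A):
    return (-0.5 * np.sum((x - A) ** 2) / sigma2
            - 0.5 * N * np.log(2.0 * np.pi * sigma2))

# In the Bayesian setting the parameter itself has a distribution:
# a Gaussian prior pi(A) with (assumed) mean 0 and variance 1.
mu_prior, var_prior = 0.0, 1.0
def log_prior(A):
    return (-0.5 * (A - mu_prior) ** 2 / var_prior
            - 0.5 * np.log(2.0 * np.pi * var_prior))
```

With these ingredients in place, any estimator of A can be written as a function of the samples x and, in the Bayesian case, of the prior.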

After the model is formed, the goal is to estimate the parameters, commonly denoted $\hat{\boldsymbol{\theta}}$, where the "hat" indicates the estimate.

One common estimator is the minimum mean squared error (MMSE) estimator, which utilizes the error between the estimated parameters and the actual value of the parameters,

$$\mathbf{e} = \hat{\boldsymbol{\theta}} - \boldsymbol{\theta},$$

as the basis for optimality. This error term is squared, and the expected value of the squared error is minimized to obtain the MMSE estimator.
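As a rough numerical illustration of this criterion, the sketch below reuses the hypothetical Gaussian model assumed earlier: the parameter is drawn from its prior, data are generated, and the average squared error of two estimators is compared over many trials. For this conjugate Gaussian setup the posterior mean is the MMSE estimate; the trial count and numerical values are arbitrary assumptions.

```python
import numpy as np

# Monte Carlo comparison of average squared error (assumed Gaussian model,
# as in the earlier sketch): the posterior mean is the MMSE estimate here,
# and should show a smaller average squared error than the plain sample mean.
rng = np.random.default_rng(1)

N, sigma2 = 10, 1.0             # few samples, known noise variance (assumed)
mu_prior, var_prior = 0.0, 0.5  # Gaussian prior on the parameter A (assumed)
trials = 50_000

se_sample_mean = 0.0
se_posterior_mean = 0.0
for _ in range(trials):
    A = rng.normal(mu_prior, np.sqrt(var_prior))      # draw A from its prior
    x = A + rng.normal(0.0, np.sqrt(sigma2), size=N)   # generate the N samples

    a_hat_classical = x.mean()                          # sample-mean estimate
    var_post = 1.0 / (1.0 / var_prior + N / sigma2)     # conjugate posterior variance
    a_hat_bayes = var_post * (mu_prior / var_prior + x.sum() / sigma2)  # posterior mean

    se_sample_mean += (a_hat_classical - A) ** 2        # squared error of each estimate
    se_posterior_mean += (a_hat_bayes - A) ** 2

print("average squared error, sample mean:   ", se_sample_mean / trials)
print("average squared error, posterior mean:", se_posterior_mean / trials)
```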
