Mean Squared Error

In statistics, the mean squared error (MSE) of an estimator is one of many ways to quantify the difference between the values implied by an estimator and the true values of the quantity being estimated. MSE is a risk function, corresponding to the expected value of the squared error loss or quadratic loss. It measures the average of the squares of the errors, where the error is the amount by which the value implied by the estimator differs from the quantity to be estimated. The difference arises because of randomness or because the estimator does not account for information that could produce a more accurate estimate.
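
Written symbolically, this is the standard definition; the symbols below (an estimator θ̂ of a parameter θ, and predicted and observed values Ŷ_i and Y_i) are the conventional choices and are assumed here rather than taken from the text above:

    \mathrm{MSE}(\hat{\theta}) = \operatorname{E}\!\left[ (\hat{\theta} - \theta)^2 \right],
    \qquad
    \mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( \hat{Y}_i - Y_i \right)^2 ,

where the first expression is the MSE of an estimator and the second is its sample form for n predictions compared with observed values.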

The MSE is the second moment (about the origin) of the error, and thus incorporates both the variance of the estimator and its bias. For an unbiased estimator, the MSE is the variance of the estimator. Like the variance, the MSE has the same units of measurement as the square of the quantity being estimated. In analogy with the standard deviation, taking the square root of the MSE yields the root mean square error or root mean square deviation (RMSE or RMSD), which has the same units as the quantity being estimated; for an unbiased estimator, the RMSE is the square root of the variance, known as the standard deviation.
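
The relationship just described can be stated compactly; the following is the standard bias-variance identity, written with the same conventional symbols assumed above rather than notation from the article:

    \mathrm{MSE}(\hat{\theta}) = \operatorname{Var}(\hat{\theta}) + \left( \operatorname{Bias}(\hat{\theta}, \theta) \right)^2,
    \qquad
    \operatorname{Bias}(\hat{\theta}, \theta) = \operatorname{E}[\hat{\theta}] - \theta ,

so for an unbiased estimator the bias term vanishes and the MSE reduces to the variance; in either case

    \mathrm{RMSE} = \sqrt{\mathrm{MSE}} .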
