Mean Squared Error
In statistics, the mean squared error (MSE) of an estimator is one of many ways to quantify the difference between the values implied by an estimator and the true values of the quantity being estimated. MSE is a risk function, corresponding to the expected value of the squared error loss or quadratic loss. It measures the average of the squares of the "errors," where the error is the amount by which the value implied by the estimator differs from the quantity to be estimated. The difference arises because of randomness or because the estimator does not account for information that could produce a more accurate estimate.
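In symbols (notation introduced here for illustration), if \hat{\theta} is an estimator of an unknown quantity \theta, the MSE is the expected squared error; for n predicted values \hat{Y}_i compared against observed values Y_i, it reduces to the average of the squared residuals:

\mathrm{MSE}(\hat{\theta}) = \operatorname{E}\big[(\hat{\theta} - \theta)^{2}\big], \qquad \mathrm{MSE} = \frac{1}{n}\sum_{i=1}^{n}\big(\hat{Y}_{i} - Y_{i}\big)^{2}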
The MSE is the second moment (about the origin) of the error, and thus incorporates both the variance of the estimator and its bias. For an unbiased estimator, the MSE is simply the variance of the estimator. Like the variance, the MSE has the same units of measurement as the square of the quantity being estimated. In analogy with the standard deviation, taking the square root of the MSE yields the root mean square error or root mean square deviation (RMSE or RMSD), which has the same units as the quantity being estimated; for an unbiased estimator, the RMSE is the square root of the variance, known as the standard deviation.
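The decomposition described above is \mathrm{MSE}(\hat{\theta}) = \operatorname{Var}(\hat{\theta}) + \big(\operatorname{Bias}(\hat{\theta})\big)^{2}. The sketch below, assuming NumPy and a deliberately biased estimator chosen only to make the bias term visible, checks this identity by simulation and reports the RMSE as the square root of the MSE.

import numpy as np

rng = np.random.default_rng(0)

true_mean = 5.0           # the quantity being estimated
n, trials = 20, 100_000   # sample size and number of simulated samples

# Many samples from a normal population with the known mean.
samples = rng.normal(loc=true_mean, scale=2.0, size=(trials, n))

# A deliberately biased estimator: the sample mean shrunk toward zero.
estimates = 0.9 * samples.mean(axis=1)

mse = np.mean((estimates - true_mean) ** 2)        # average squared error
variance = np.var(estimates)                       # variance of the estimator
bias_sq = (np.mean(estimates) - true_mean) ** 2    # squared bias
rmse = np.sqrt(mse)                                # same units as the estimate

print(f"MSE          : {mse:.4f}")
print(f"Var + Bias^2 : {variance + bias_sq:.4f}")  # matches the MSE
print(f"RMSE         : {rmse:.4f}")

Because the empirical mean and variance are taken over the same set of simulated estimates, the reported MSE and the sum of the variance and squared bias agree exactly, illustrating that an estimator's error can always be split into a spread term and a systematic-offset term.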