History
While the analysis of variance reached fruition in the 20th century, its antecedents extend centuries into the past, according to Stigler. These include hypothesis testing, the partitioning of sums of squares, experimental techniques, and the additive model. Laplace was performing hypothesis testing in the 1770s. The development of least-squares methods by Laplace and Gauss circa 1800 provided an improved method of combining observations (over the existing practices of astronomy and geodesy). It also initiated much study of the contributions to sums of squares. Laplace soon knew how to estimate a variance from a residual (rather than a total) sum of squares. By 1827 Laplace was using least-squares methods to address ANOVA problems regarding measurements of atmospheric tides. Before 1800 astronomers had isolated observational errors resulting from reaction times (the "personal equation") and had developed methods of reducing them. The experimental methods used in the study of the personal equation were later accepted by the emerging field of psychology, which developed strong (full factorial) experimental methods to which randomization and blinding were soon added. An eloquent non-mathematical explanation of the additive effects model was available in 1885.
Sir Ronald Fisher introduced the term "variance" and proposed a formal analysis of variance in a 1918 article, "The Correlation Between Relatives on the Supposition of Mendelian Inheritance". His first application of the analysis of variance was published in 1921. Analysis of variance became widely known after being included in Fisher's 1925 book Statistical Methods for Research Workers.
Randomization models were developed by several researchers; the first was published in Polish by Jerzy Neyman in 1923.
One of the attributes of ANOVA that ensured its early popularity was computational elegance. The structure of the additive model allows solution for the additive coefficients by simple algebra rather than by matrix calculations. In the era of mechanical calculators this simplicity was critical. The determination of statistical significance also required access to tables of the F distribution, which were supplied by early statistics texts.
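As a minimal illustration of this point (an assumed balanced one-way layout, not taken from the historical sources), the least-squares estimates of the additive coefficients reduce to simple averages, and the total sum of squares splits into between-group and residual parts that can be accumulated on a mechanical calculator:

```latex
% Sketch under the assumption of a balanced one-way layout with a groups
% and n observations per group; estimates are plain averages, no matrices.
\[
  y_{ij} = \mu + \alpha_i + \varepsilon_{ij},
  \qquad i = 1,\dots,a, \quad j = 1,\dots,n,
\]
\[
  \hat{\mu} = \bar{y}_{..}, \qquad
  \hat{\alpha}_i = \bar{y}_{i.} - \bar{y}_{..},
\]
\[
  \underbrace{\sum_{i,j}\bigl(y_{ij}-\bar{y}_{..}\bigr)^2}_{\text{total SS}}
  \;=\;
  \underbrace{n\sum_{i}\bigl(\bar{y}_{i.}-\bar{y}_{..}\bigr)^2}_{\text{between-group SS}}
  \;+\;
  \underbrace{\sum_{i,j}\bigl(y_{ij}-\bar{y}_{i.}\bigr)^2}_{\text{residual SS}},
\]
\[
  F = \frac{\text{between-group SS}/(a-1)}{\text{residual SS}/\bigl(a(n-1)\bigr)}.
\]
```

The F ratio computed this way is then compared against the tabulated F distribution mentioned above.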