Akaike Information Criterion

The Akaike information criterion (AIC) is a measure of the relative goodness of fit of a statistical model. The AIC is grounded in the concept of information entropy, in effect offering a relative measure of the information lost when a given model is used to describe reality. It can be said to describe the tradeoff between bias and variance in model construction, or, loosely speaking, between the accuracy and the complexity of the model.
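The standard definition makes this tradeoff explicit. With \(\hat{L}\) the maximized value of the model's likelihood function and \(k\) the number of estimated parameters:

```latex
\mathrm{AIC} = 2k - 2\ln(\hat{L})
```

The \(-2\ln(\hat{L})\) term rewards goodness of fit, while the \(2k\) term penalizes complexity; among a set of candidate models, the one with the smallest AIC is preferred.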

AIC values provide a means for model selection. AIC does not provide a test of a model in the sense of testing a null hypothesis; that is, AIC says nothing about how well a model fits the data in an absolute sense. If all the candidate models fit poorly, AIC will give no warning of that.
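As a concrete illustration of AIC-based model selection, the sketch below compares two least-squares fits under a Gaussian error assumption, using AIC = 2k − 2 ln(L̂). The sample size, residual sums of squares, and parameter counts are hypothetical, chosen only to show the comparison; the `aic` helper is illustrative, not a library function.

```python
import math

def aic(n, rss, k):
    """AIC for a least-squares fit with Gaussian errors.

    n   -- number of observations
    rss -- residual sum of squares of the fitted model
    k   -- number of estimated parameters (counted the same way for
           every candidate model, so the comparison is consistent)
    """
    # Maximized Gaussian log-likelihood with the error variance
    # profiled out: ln L = -(n/2) * (ln(2*pi) + ln(rss/n) + 1)
    log_lik = -0.5 * n * (math.log(2 * math.pi) + math.log(rss / n) + 1)
    return 2 * k - 2 * log_lik

# Hypothetical fits of two candidate models to the same 20 observations:
# a 2-parameter model, and a 4-parameter model with a slightly better fit.
print(aic(20, 40.0, k=2))  # simpler model
print(aic(20, 38.0, k=4))  # more complex model
# The model with the lower AIC is preferred; here the small improvement
# in fit does not justify the two extra parameters.
```

Note that only AIC differences between models fitted to the same data are meaningful; the absolute AIC value of a single model carries no information on its own, which is exactly the point made above about absolute fit.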

