Akaike Information Criterion

The Akaike information criterion (AIC) is a measure of the relative goodness of fit of a statistical model. The AIC is grounded in the concept of information entropy, in effect offering a relative measure of the information lost when a given model is used to describe reality. It describes the tradeoff between bias and variance in model construction, or, loosely speaking, between the accuracy and the complexity of the model.
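Concretely, the criterion is computed as AIC = 2k − 2 ln(L̂), where k is the number of estimated parameters and L̂ is the maximized value of the model's likelihood function. A minimal sketch in Python (the function name `aic` is illustrative, not from any particular library):

```python
def aic(k, log_likelihood):
    """Akaike information criterion: AIC = 2k - 2*ln(L-hat).

    k              -- number of estimated parameters in the model
    log_likelihood -- maximized log-likelihood ln(L-hat) of the model
    """
    return 2 * k - 2 * log_likelihood

# Example: a model with 3 parameters and maximized log-likelihood -120.5
print(aic(3, -120.5))  # 247.0
```

Lower AIC is better: the 2k term penalizes extra parameters, while the −2 ln(L̂) term rewards goodness of fit.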

AIC values provide a means for model selection. AIC does not provide a test of a model in the sense of testing a null hypothesis; i.e., AIC says nothing about how well a model fits the data in an absolute sense. If all the candidate models fit poorly, AIC will give no warning of that.
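In practice, model selection compares AIC values across candidate models fitted to the same data: the model with the lowest AIC is preferred, and exp((AIC_min − AIC_i)/2) gives the relative likelihood that model i minimizes the estimated information loss. A sketch of that comparison (the AIC values and the function name are hypothetical):

```python
import math

def relative_likelihoods(aic_values):
    """For candidate models fitted to the same data, return
    exp((AIC_min - AIC_i) / 2) for each model: its likelihood of
    minimizing information loss, relative to the best candidate."""
    best = min(aic_values)
    return [math.exp((best - a) / 2) for a in aic_values]

# Hypothetical AIC values for three candidate models
print(relative_likelihoods([100.0, 102.0, 110.0]))
```

Here the second model is exp(−1) ≈ 0.37 times as probable as the best model to minimize information loss, and the third is essentially ruled out; note that these are still only relative statements among the candidates considered.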

