Akaike Information Criterion

The Akaike information criterion (AIC) is a measure of the relative goodness of fit of a statistical model. The AIC is grounded in the concept of information entropy, in effect offering a relative measure of the information lost when a given model is used to describe reality. It can be said to describe the tradeoff between bias and variance in model construction or, loosely speaking, between the accuracy and the complexity of the model.

AIC values provide a means for model selection: among a set of candidate models, the one with the lowest AIC is preferred. AIC does not provide a test of a model in the sense of testing a null hypothesis; that is, AIC says nothing about how well a model fits the data in an absolute sense. If all the candidate models fit poorly, AIC will give no warning of that.
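The comparison described above can be sketched in code. This is a minimal illustration using the standard definition AIC = 2k − 2 ln(L̂), where k is the number of estimated parameters and L̂ is the maximized likelihood; the Gaussian least-squares shortcut AIC = n ln(RSS/n) + 2k (valid up to an additive constant) is also shown. The numbers and model names are hypothetical, chosen only to show that the model with the lowest AIC is preferred.

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln(L-hat)."""
    return 2 * k - 2 * log_likelihood

def aic_least_squares(rss, n, k):
    """Gaussian least-squares special case (up to an additive constant):
    AIC = n * ln(RSS / n) + 2k."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical fits of two models to the same n = 100 observations.
# The complex model fits slightly better (lower RSS) but is penalized
# for its extra parameters; lower AIC wins.
n = 100
aic_simple = aic_least_squares(rss=250.0, n=n, k=2)   # e.g. a straight line
aic_complex = aic_least_squares(rss=245.0, n=n, k=5)  # e.g. a quartic
best = "simple" if aic_simple < aic_complex else "complex"
```

Note that only AIC differences are meaningful here, which is why the dropped additive constant in the least-squares form does no harm: it cancels when two models are compared on the same data.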

