Fisher Information

In mathematical statistics and information theory, the Fisher information (sometimes simply called information) can be defined as the variance of the score (the derivative of the log-likelihood with respect to the parameter), or equivalently as the expected value of the observed information. In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior (according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families). The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician R. A. Fisher (following initial results by F. Y. Edgeworth). The Fisher information is also used to construct the Jeffreys prior in Bayesian statistics.
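
Concretely, with the score defined as the derivative of the log-likelihood, the two characterizations above can be written as follows (a standard formulation; the usual regularity conditions are assumed):

    \mathcal{I}(\theta)
      = \operatorname{Var}\!\left[\frac{\partial}{\partial\theta}\log f(X;\theta)\right]
      = \operatorname{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{2}\right]
      = -\operatorname{E}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X;\theta)\right]

where the second equality holds because the score has mean zero, and the final expression is the expected value of the observed information.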

The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates. It can also be used in the formulation of test statistics, such as the Wald test.
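
To make the covariance claim concrete, here is a minimal numerical sketch in Python (an illustration, not part of the original article): for a Bernoulli(p) model the Fisher information of one observation is I(p) = 1/(p(1 - p)), and the simulation compares the sampling variance of the maximum-likelihood estimate against 1/(n I(p)); it also averages the corresponding Wald statistic under the null hypothesis. All parameter values and variable names are assumptions chosen for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative settings (assumptions for this sketch).
    p_true = 0.3   # true Bernoulli parameter
    n = 1_000      # sample size per replication
    reps = 5_000   # number of simulated replications

    # For one Bernoulli(p) observation, I(p) = 1 / (p * (1 - p)), so the
    # asymptotic variance of the MLE (the sample mean) is 1 / (n * I(p)).
    fisher_info = 1.0 / (p_true * (1.0 - p_true))
    asymptotic_var = 1.0 / (n * fisher_info)

    samples = rng.binomial(1, p_true, size=(reps, n))
    p_hat = samples.mean(axis=1)  # MLE of p in each replication

    print(f"empirical variance of the MLE: {p_hat.var():.3e}")
    print(f"inverse information / n:       {asymptotic_var:.3e}")

    # Wald statistic for H0: p = p_true, using the estimated information;
    # asymptotically chi-squared with 1 degree of freedom, so its mean is near 1.
    wald = (p_hat - p_true) ** 2 * n / (p_hat * (1.0 - p_hat))
    print(f"mean Wald statistic under H0:  {wald.mean():.3f}")

The two printed variances should agree closely, which is the sense in which the inverse Fisher information gives the covariance of the maximum-likelihood estimate.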
