Fisher Information Metric

In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures defined on a common probability space. It can be used to calculate the informational difference between measurements.

The metric is interesting in several respects. First, it can be understood to be the infinitesimal form of the relative entropy or Kullback–Leibler divergence; specifically, it is the Hessian of the divergence with respect to its second argument, evaluated where the two distributions coincide. Alternately, it can be understood as the metric induced by the flat-space Euclidean metric, after appropriate changes of variable. When extended to complex projective Hilbert space, it becomes the Fubini–Study metric; when written in terms of mixed states, it is the quantum Bures metric.
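The connection to the Kullback–Leibler divergence can be checked numerically. The following is a minimal sketch (not taken from the article) for a one-parameter Bernoulli family: the second derivative of the divergence at zero separation recovers the known Fisher information 1/(p(1−p)).

```python
import numpy as np

def kl_bernoulli(p, q):
    """KL divergence D(p || q) between Bernoulli(p) and Bernoulli(q)."""
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

p = 0.3
eps = 1e-4

# Second derivative of D(p || q) with respect to q at q = p, by central
# finite differences. For a one-parameter family this Hessian is the
# Fisher information I(p).
hessian = (kl_bernoulli(p, p + eps)
           - 2 * kl_bernoulli(p, p)
           + kl_bernoulli(p, p - eps)) / eps**2

analytic = 1.0 / (p * (1 - p))  # closed-form Fisher information of a Bernoulli
print(hessian, analytic)        # both ≈ 4.76
```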

Considered purely as a matrix, it is known as the Fisher information matrix. Considered as a measurement technique, where it is used to estimate hidden parameters in terms of observed random variables, it is the expected information; the closely related observed information is instead computed from a particular observed sample rather than as an expectation over the model.
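To illustrate the distinction, here is a minimal numerical sketch, assuming a Poisson(λ) model (an example chosen here, not one named in the article): the expected information per observation is 1/λ, while the observed information depends on the particular sample drawn.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.5                          # assumed true Poisson rate
x = rng.poisson(lam, size=1000)    # simulated observations

# Expected (Fisher) information per observation:
# E[(d/dλ log p(X; λ))^2] = 1/λ, estimated here by Monte Carlo.
score = x / lam - 1.0
expected_info = np.mean(score**2)

# Observed information for the whole sample:
# -d²/dλ² of the log-likelihood at the data, i.e. sum(x_i)/λ²,
# which varies from sample to sample.
observed_info = x.sum() / lam**2

print(expected_info, 1 / lam)      # Monte Carlo estimate vs analytic 1/λ
print(observed_info, len(x) / lam) # observed vs n times the expected information
```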

