Negentropy - Information Theory


In information theory and statistics, negentropy is used as a measure of distance to normality. Among all distributions with a given variance, the normal (Gaussian) distribution has the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same variance. It is therefore always nonnegative, is invariant under any invertible linear change of coordinates, and vanishes if and only if the signal is Gaussian.
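These properties are easy to observe numerically. The sketch below is an illustration added here, not part of the original text: the helper name `negentropy` is ours, and it assumes SciPy >= 1.6 for `scipy.stats.differential_entropy`. It estimates negentropy from samples and shows it near zero for Gaussian data and positive otherwise.

    # Sketch: estimate negentropy J(x) from a 1-D sample.
    # J(x) = (entropy of the Gaussian with the same variance) - (entropy of x),
    # so it is ~0 for Gaussian data and positive for non-Gaussian data.
    import numpy as np
    from scipy.stats import differential_entropy  # SciPy >= 1.6

    def negentropy(sample):
        """Estimated J(x) in nats (hypothetical helper for this sketch)."""
        var = sample.var()
        gauss_entropy = 0.5 * np.log(2 * np.pi * np.e * var)  # closed form for a Gaussian
        return gauss_entropy - differential_entropy(sample)

    rng = np.random.default_rng(0)
    n = 100_000
    print(negentropy(rng.standard_normal(n)))   # ~0.00  (Gaussian)
    print(negentropy(rng.uniform(-1, 1, n)))    # ~0.18  (uniform)
    print(negentropy(rng.laplace(size=n)))      # ~0.07  (Laplace)

Note that the uniform and Laplace estimates stay roughly the same if the samples are rescaled, consistent with the invariance property above.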

Negentropy is defined as

    J(x) = S(\varphi_x) - S(x),

where S(\varphi_x) is the differential entropy of the Gaussian density \varphi_x with the same mean and variance as x, and S(x) is the differential entropy of x:

    S(x) = -\int p_x(u) \log p_x(u) \, du.
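
As a concrete check (a worked example added here, not in the original text), take x uniform on [0, 1], whose density is 1 on that interval:

    S(x) = -\int_0^1 1 \cdot \log 1 \, du = 0, \qquad \sigma^2 = \tfrac{1}{12},

    J(x) = \tfrac{1}{2}\log\!\left(2\pi e \cdot \tfrac{1}{12}\right) - 0 \approx 0.176 \ \text{nats},

which is positive, as the uniform distribution is not Gaussian.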

Negentropy is used in statistics and signal processing; in particular, it serves as a measure of non-Gaussianity in independent component analysis (ICA), and is related to network entropy.
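In ICA practice the exact definition above is rarely computed directly. One well-known stand-in, an addition here rather than something stated in the text, is Hyvarinen's log-cosh approximation J(x) ≈ (E[G(x)] − E[G(ν)])² with G(u) = log cosh(u) and ν standard normal. A minimal sketch under that assumption:

    # Sketch of the log-cosh negentropy approximation used as a
    # non-Gaussianity measure in ICA (an assumption of this example):
    #     J(x) ~ (E[G(x)] - E[G(nu)])^2,  G(u) = log cosh(u),
    # where nu ~ N(0, 1) and x is standardized first.
    import numpy as np

    rng = np.random.default_rng(1)
    G = lambda u: np.log(np.cosh(u))

    # Monte Carlo reference value E[G(nu)] for nu ~ N(0, 1) (about 0.37).
    gauss_ref = G(rng.standard_normal(1_000_000)).mean()

    def negentropy_logcosh(x):
        z = (x - x.mean()) / x.std()    # standardize to zero mean, unit variance
        return (G(z).mean() - gauss_ref) ** 2

    n = 100_000
    print(negentropy_logcosh(rng.standard_normal(n)))  # ~0 for Gaussian data
    print(negentropy_logcosh(rng.laplace(size=n)))     # > 0 for non-Gaussian data

Because it needs only sample averages of a smooth function, an approximation of this kind is cheap and robust enough to be maximized inside an ICA iteration, which is why it is preferred over entropy estimation there.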
