Entropy (Information Theory) - Definition

Definition

Named after Boltzmann's H-theorem, Shannon denoted the entropy H of a discrete random variable X with possible values {x1, ..., xn} and probability mass function P(X) as

    H(X) = E[I(X)] = E[-log(P(X))].

Here E is the expected value operator, and I is the information content of X.

I(X) is itself a random variable. The entropy can explicitly be written as

    H(X) = -Σ_{i=1}^{n} P(x_i) log_b P(x_i),

where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10; the corresponding unit of entropy is the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
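
As an illustration, here is a minimal Python sketch of the explicit sum above, assuming a made-up two-outcome distribution; the function name and example probabilities are hypothetical, not part of the original text.

    import math

    def entropy(probs, base=2):
        # H(X) = -sum_i P(x_i) * log_b P(x_i); equivalently the expected
        # information content E[I(X)] with I(x) = -log_b P(x).
        return -sum(p * math.log(p, base) for p in probs)

    # Assumed example: a biased coin with P(heads) = 0.25, P(tails) = 0.75.
    p = [0.25, 0.75]
    print(entropy(p, base=2))       # in bits (b = 2), about 0.811
    print(entropy(p, base=math.e))  # in nats (b = e)
    print(entropy(p, base=10))      # in dits (b = 10)

Changing the base only rescales the result by a constant factor, which is why the choice of b amounts to a choice of unit.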

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 log_b 0 is taken to be 0, which is consistent with the well-known limit:

    lim_{p→0+} p log_b(p) = 0.
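
This convention can be checked numerically; the sketch below (again Python, with assumed sample values) shows p · log_2(p) shrinking toward 0 as p → 0+, and an entropy helper that simply skips zero-probability outcomes.

    import math

    # p * log2(p) tends to 0 as p approaches 0 from the right.
    for p in [0.1, 0.01, 1e-6, 1e-12]:
        print(p, p * math.log2(p))

    # In practice, the sum skips outcomes with zero probability:
    def entropy_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy_bits([0.5, 0.5, 0.0]))  # 1.0 bit, same as for [0.5, 0.5]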
