Entropy (Information Theory) - Definition

Definition

Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

H(X) = \mathrm{E}[I(X)] = \mathrm{E}[-\log_b P(X)].
Here E is the expected value operator, and I is the information content of X.
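
As a quick worked instance (an illustration of the definition, not taken from the article): for a fair coin, each outcome has probability 1/2, so I(heads) = I(tails) = -\log_2(1/2) = 1 bit, and therefore

H(X) = \mathrm{E}[I(X)] = \tfrac{1}{2}(1) + \tfrac{1}{2}(1) = 1 \text{ bit}.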

I(X) is itself a random variable. The entropy can explicitly be written as

H(X) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i),
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10; the corresponding units of entropy are the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
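
As a minimal sketch of that sum (assuming the distribution is given as a plain list of probabilities; the function name entropy and the use of Python are my own, not from the article):

    import math

    def entropy(probs, base=2):
        # Shannon entropy of a discrete distribution given as a list of probabilities.
        # Terms with p = 0 are skipped, matching the convention that 0 * log_b(0) = 0.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))          # 1.0 bit (fair coin)
    print(entropy([1/6] * 6))           # ~2.585 bits (fair six-sided die)
    print(entropy([1/6] * 6, math.e))   # ~1.792 nats
    print(entropy([1/6] * 6, 10))       # ~0.778 dits

Changing the base only rescales the result by a constant factor, which is why the choice of b amounts to a choice of unit.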

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 \log_b 0 is taken to be 0, which is consistent with the well-known limit:

\lim_{p \to 0^+} p \log_b(p) = 0.
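
For reference, a short derivation of that limit using L'Hôpital's rule (standard calculus, not part of the quoted text):

\lim_{p \to 0^+} p \log_b p = \lim_{p \to 0^+} \frac{\log_b p}{1/p} = \lim_{p \to 0^+} \frac{1/(p \ln b)}{-1/p^2} = \lim_{p \to 0^+} \frac{-p}{\ln b} = 0.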
