Entropy (Information Theory) - Definition

Definition

Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X) as

    H(X) = E[I(X)] = E[-\log_b P(X)]

Here E is the expected value operator, and I is the information content of X.

I(X) is itself a random variable. The entropy can explicitly be written as

    H(X) = \sum_{i=1}^{n} P(x_i) I(x_i) = -\sum_{i=1}^{n} P(x_i) \log_b P(x_i)

where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10.
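
To make the explicit sum concrete, here is a minimal Python sketch (the three-outcome distribution and the helper name entropy are invented for illustration, not taken from the article) that evaluates -\sum P(x_i) \log_b P(x_i) for the three common bases:

    import math

    # Invented example distribution P(X) over three outcomes.
    pmf = [0.5, 0.25, 0.25]

    def entropy(probs, b=2):
        # H(X) = -sum_i P(x_i) * log_b(P(x_i))
        return -sum(p * math.log(p, b) for p in probs)

    print(entropy(pmf, 2))        # 1.5   -> bits (b = 2)
    print(entropy(pmf, math.e))   # ~1.04 -> nats (b = e)
    print(entropy(pmf, 10))       # ~0.45 -> dits (b = 10)

Since log_b(p) = ln(p) / ln(b), changing the base only rescales the result by a constant factor; the choice of b is purely a choice of unit.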

In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 \log_b(0) is taken to be 0, which is consistent with the well-known limit:

    \lim_{p \to 0^+} p \log_b(p) = 0.
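
As a sketch of how this convention is typically applied in practice (again with an invented distribution), zero-probability terms are simply skipped, which matches the limit above:

    import math

    # Invented distribution that assigns zero probability to one outcome.
    pmf = [0.5, 0.5, 0.0]

    def entropy(probs, b=2):
        # Skip zero-probability terms, i.e. treat 0 * log_b(0) as 0.
        return -sum(p * math.log(p, b) for p in probs if p > 0)

    print(entropy(pmf))  # 1.0 bit: the zero-probability outcome contributes nothing

    # Numerical check of the limit: p * log2(p) tends to 0 as p -> 0+.
    for p in (1e-3, 1e-6, 1e-9):
        print(p * math.log2(p))  # approx -1.0e-02, -2.0e-05, -3.0e-08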
