Definition
Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x1, ..., xn} and probability mass function P(X) as

H(X) = E[I(X)] = E[−logb(P(X))].
Here E is the expected value operator, and I is the information content of X.
I(X) is itself a random variable. The entropy can explicitly be written as

H(X) = ∑i=1..n P(xi) I(xi) = −∑i=1..n P(xi) logb P(xi),
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10.
In the case of p(xi) = 0 for some i, the value of the corresponding summand 0 logb 0 is taken to be 0, which is consistent with the well-known limit:
lim p→0+ p logb(p) = 0.
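As a concrete illustration, here is a minimal Python sketch of this definition; the function name and the example distributions are hypothetical, and zero-probability outcomes are simply skipped, which matches the convention 0 logb 0 = 0 above.

```python
import math

def shannon_entropy(probs, base=2.0):
    """Entropy H(X) = -sum_i P(x_i) * log_b(P(x_i)) of a discrete distribution.

    Terms with P(x_i) == 0 are skipped, i.e. 0 * log_b(0) is treated as 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin: one bit of entropy in base 2, ~0.693 nats in base e, ~0.301 dits in base 10.
coin = [0.5, 0.5]
print(shannon_entropy(coin, base=2))        # 1.0
print(shannon_entropy(coin, base=math.e))   # ~0.693
print(shannon_entropy(coin, base=10))       # ~0.301

# A distribution with a zero-probability outcome still evaluates correctly,
# because the 0 * log(0) term is dropped.
print(shannon_entropy([1.0, 0.0], base=2))  # 0.0
```

Changing the base only rescales the result by a constant factor (logb(a) = ln(a)/ln(b)), which is why the choice of b amounts to a choice of unit rather than a different quantity.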