Definition
Named after Boltzmann's H-theorem, Shannon defined the entropy H of a discrete random variable X with possible values {x1, ..., xn} and probability mass function P(X) as

$$\mathrm{H}(X) = \mathbb{E}[\mathrm{I}(X)] = \mathbb{E}[-\log_b P(X)].$$
Here E is the expected value operator, and I is the information content of X.
I(X) is itself a random variable. The entropy can explicitly be written as

$$\mathrm{H}(X) = \sum_{i=1}^{n} P(x_i)\,\mathrm{I}(x_i) = -\sum_{i=1}^{n} P(x_i)\,\log_b P(x_i),$$
where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10, and the corresponding unit of entropy is the bit for b = 2, the nat for b = e, and the dit (or digit) for b = 10.
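For instance, a fair coin flip has two outcomes with P(x1) = P(x2) = 1/2, giving, in base 2,

$$\mathrm{H}(X) = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = 1 \text{ bit}.$$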
In the case of P(x_i) = 0 for some i, the value of the corresponding summand 0 log_b 0 is taken to be 0, which is consistent with the well-known limit:

$$\lim_{p \to 0^{+}} p \log_b(p) = 0.$$
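As a minimal sketch of how this definition might be computed (the function name shannon_entropy and its interface are illustrative, not from the original text), the following Python evaluates the sum above, skipping zero-probability terms per the 0 log_b 0 = 0 convention:

```python
import math

def shannon_entropy(probs, base=2.0):
    """Compute H(X) = -sum_i P(x_i) * log_b(P(x_i)) for a discrete distribution.

    probs: sequence of probabilities summing to 1.
    base:  logarithm base b (2 -> bits, e -> nats, 10 -> dits).
    """
    h = 0.0
    for p in probs:
        if p < 0:
            raise ValueError("probabilities must be non-negative")
        if p > 0:  # a summand with P(x_i) = 0 contributes 0, so it is skipped
            h -= p * math.log(p, base)
    return h

# A fair coin has entropy 1 bit; zero-probability outcomes add nothing.
print(shannon_entropy([0.5, 0.5]))          # 1.0
print(shannon_entropy([0.5, 0.5, 0.0]))     # 1.0
print(shannon_entropy([1.0], base=math.e))  # 0.0 (a certain outcome, in nats)
```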