Entropy (Information Theory) - Example

Example

Consider tossing a coin with known, not necessarily fair, probabilities of coming up heads or tails; this can be modelled as a Bernoulli process.

The entropy of the unknown result of the next toss of the coin is maximized if the coin is fair (that is, if heads and tails both have equal probability 1/2). This is the situation of maximum uncertainty as it is most difficult to predict the outcome of the next toss; the result of each toss of the coin delivers a full 1 bit of information.
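For concreteness, the entropy of a single toss with heads-probability p is given by the binary entropy function (a standard result, stated here explicitly rather than taken from the text above):

    H(X) = -p \log_2 p - (1 - p) \log_2 (1 - p)

This expression attains its maximum of 1 bit at p = 1/2 and falls to 0 at p = 0 or p = 1.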

However, if we know the coin is not fair, but comes up heads or tails with probabilities p and q, where p ≠ q, then there is less uncertainty. Every time it is tossed, one side is more likely to come up than the other. The reduced uncertainty is quantified in a lower entropy: on average each toss of the coin delivers less than a full 1 bit of information.
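A minimal sketch in Python, evaluating the binary entropy function above for a fair and a biased coin (the helper name binary_entropy and the example probability 0.7 are illustrative choices, not from the text):

    import math

    def binary_entropy(p):
        """Entropy in bits of one toss of a coin that comes up heads with probability p."""
        if p in (0.0, 1.0):
            return 0.0  # a certain outcome carries no uncertainty
        q = 1.0 - p
        return -(p * math.log2(p) + q * math.log2(q))

    print(binary_entropy(0.5))  # fair coin: 1.0 bit per toss
    print(binary_entropy(0.7))  # biased coin: about 0.881 bits per toss

As the bias grows, the per-toss entropy drops below 1 bit, matching the discussion above.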

The extreme case is that of a double-headed coin that never comes up tails, or a double-tailed coin that never results in a head. Then there is no uncertainty. The entropy is zero: each toss of the coin delivers no information. In this respect, entropy can be normalized by dividing it by the information length. This measure is called metric entropy and is a measure of the randomness of the information.
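One common reading of this normalization, sketched below under the assumption of independent tosses (and reusing the binary_entropy helper defined above), divides the total entropy of a sequence of tosses by its length, recovering a per-toss value between 0 (a double-headed or double-tailed coin) and 1 (a fair coin):

    import math

    def binary_entropy(p):
        if p in (0.0, 1.0):
            return 0.0
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    n = 1000                                  # length of a sequence of independent tosses
    for p in (0.5, 0.7, 1.0):
        total_bits = n * binary_entropy(p)    # entropy of the whole sequence (additive for independent tosses)
        print(p, total_bits / n)              # normalized per-toss entropy, in [0, 1]
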
