Binary Entropy Function - Explanation

Explanation

In terms of information theory, entropy is a measure of the uncertainty in a message. To put it intuitively, suppose p = 0. At this probability, the event is certain never to occur, so there is no uncertainty at all, leading to an entropy of 0. If p = 1, the result is again certain, so the entropy is 0 here as well. When p = 1/2, the uncertainty is at a maximum; if one were to place a fair bet on the outcome in this case, there is no advantage to be gained from prior knowledge of the probabilities. In this case, the entropy is at its maximum value of 1 bit. Intermediate values fall between these cases; for instance, if p = 1/4, there is still some uncertainty about the outcome, but one can still predict the outcome correctly more often than not, so the uncertainty measure, or entropy, is less than 1 full bit.
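
As a concrete check of the values discussed above, here is a minimal Python sketch of the binary entropy function H(p) = -p log2(p) - (1 - p) log2(1 - p); the helper name binary_entropy is our own, introduced only for illustration.

    import math

    def binary_entropy(p: float) -> float:
        """Binary entropy H(p) in bits, with the convention 0 * log2(0) = 0."""
        if p == 0.0 or p == 1.0:
            # A certain outcome carries no uncertainty.
            return 0.0
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    for p in (0.0, 0.25, 0.5, 1.0):
        print(f"H({p}) = {binary_entropy(p):.4f} bits")
    # Expected output:
    # H(0.0) = 0.0000 bits   (certain never to occur)
    # H(0.25) = 0.8113 bits  (predictable more often than not; under 1 bit)
    # H(0.5) = 1.0000 bits   (maximum uncertainty)
    # H(1.0) = 0.0000 bits   (certain to occur)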

Famous quotes containing the word explanation:

    Are cans constitutionally iffy? Whenever, that is, we say that we can do something, or could do something, or could have done something, is there an if in the offing—suppressed, it may be, but due nevertheless to appear when we set out our sentence in full or when we give an explanation of its meaning?
    —J.L. (John Langshaw) Austin (1911–1960)

    Auden, MacNeice, Day Lewis, I have read them all,
    Hoping against hope to hear the authentic call . . .
    And know the explanation I must pass is this
    You cannot light a match on a crumbling wall.
    —Hugh MacDiarmid (1892–1978)

    We live between two worlds; we soar in the atmosphere; we creep upon the soil; we have the aspirations of creators and the propensities of quadrupeds. There can be but one explanation of this fact. We are passing from the animal into a higher form, and the drama of this planet is in its second act.
    —W. Winwood Reade (1838–1875)