Binary Entropy Function

In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability of success p. Mathematically, the Bernoulli trial is modelled as a random variable X that can take on only two values: 0 and 1. The event X = 1 is considered a success and the event X = 0 is considered a failure. (These two events are mutually exclusive and exhaustive.)

If Pr(X = 1) = p, then Pr(X = 0) = 1 − p, and the entropy of X is given by

H_b(p) = −p log₂ p − (1 − p) log₂ (1 − p),

where 0 log₂ 0 is taken to be 0. The logarithms in this formula are usually taken (as shown in the graph) to the base 2. See binary logarithm.
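The definition above can be sketched directly in code. This is a minimal illustration (the function name `binary_entropy` is my own, not from the article), handling the 0 log 0 = 0 convention explicitly:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits, with 0 log 0 taken as 0."""
    if p == 0.0 or p == 1.0:
        return 0.0  # the 0 log 0 = 0 convention: a certain outcome carries no entropy
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

print(binary_entropy(0.5))   # 1.0 bit for a fair coin
print(binary_entropy(0.11))  # roughly half a bit for a heavily biased coin
```

Using base-2 logarithms, as the article notes, gives the result in bits; substituting `math.log` would give nats instead.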

When p = 1/2, the binary entropy function attains its maximum value, 1 bit. This is the case of the unbiased bit, the most common unit of information entropy.
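The location of the maximum can be checked numerically. This small sketch (reusing a hypothetical `binary_entropy` helper implementing the formula above) scans a grid of probabilities and finds where H_b peaks:

```python
import math

def binary_entropy(p: float) -> float:
    """H_b(p) in bits, with 0 log 0 taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Scan p over [0, 1] in steps of 0.001 and find the argmax of H_b.
grid = [i / 1000 for i in range(1001)]
best = max(grid, key=binary_entropy)
print(best, binary_entropy(best))  # 0.5 1.0
```

The peak sits at p = 0.5 with value 1 bit, matching the claim that the unbiased case maximizes the entropy.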

H_b(p) is distinguished from the entropy function H(X) in that it takes a single scalar parameter rather than a distribution or random variable. For tutorial purposes, in which the reader may not distinguish the appropriate function by its argument, H(p) is often used; however, this risks confusing the function with the analogous function related to Rényi entropy, so H_b(p) (with "b" not in italics) should be used to dispel ambiguity.

