Quantities of Information

The mathematical theory of information is based on probability theory and statistics, and quantifies information with several related measures. The choice of logarithmic base in the following formulae determines the unit of information entropy used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, based on the natural logarithm, and the hartley, based on the base-10 or common logarithm.
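As a minimal sketch of how these units relate, the snippet below computes the entropy of a fair coin flip three times, changing only the logarithmic base; the probabilities themselves never change:

```python
import math

# Entropy of a fair coin flip, in three units of information.
# Only the logarithmic base differs between the three sums.
p = [0.5, 0.5]

h_bits = -sum(x * math.log2(x) for x in p)       # binary log   -> bits
h_nats = -sum(x * math.log(x) for x in p)        # natural log  -> nats
h_hartleys = -sum(x * math.log10(x) for x in p)  # common log   -> hartleys
```

The three results differ only by a constant factor, since log_b(x) = ln(x) / ln(b): one bit equals ln 2 ≈ 0.693 nats and log10(2) ≈ 0.301 hartleys.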

In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p is zero. This is justified because lim_{p→0+} p log p = 0 for any logarithmic base.
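This convention is what lets entropy formulas handle distributions with zero-probability outcomes. A minimal sketch (the function name `entropy_bits` is my own, not from the source):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits, treating p*log(p) as 0 when p == 0."""
    # Skipping zero-probability terms implements the 0*log(0) = 0 convention,
    # and also avoids a math domain error from log2(0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A zero-probability outcome contributes nothing to the entropy:
entropy_bits([0.5, 0.5, 0.0])  # same as a fair coin: 1.0 bit
```

Without the `if p > 0` guard, `math.log2(0)` would raise an error even though the limit of the whole term is zero.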

Read more about Quantities of Information: Self-information, Entropy, Joint Entropy, Conditional Entropy (equivocation), Kullback–Leibler Divergence (information gain), Mutual Information (transinformation), Differential Entropy

Famous quotes containing the words quantities and information:

    The Walrus and the Carpenter
    Were walking close at hand:
    They wept like anything to see
    Such quantities of sand:
    “If this were only cleared away,”
    They said, “it would be grand!”
    “If seven maids with seven mops
    Swept it for half a year,
    Do you suppose,” the Walrus said,
    “That they could get it clear?”
    “I doubt it,” said the Carpenter,
    And shed a bitter tear.
    Lewis Carroll [Charles Lutwidge Dodgson] (1832–1898)

    The information links are like nerves that pervade and help to animate the human organism. The sensors and monitors are analogous to the human senses that put us in touch with the world. Data bases correspond to memory; the information processors perform the function of human reasoning and comprehension. Once the postmodern infrastructure is reasonably integrated, it will greatly exceed human intelligence in reach, acuity, capacity, and precision.
    Albert Borgman, U.S. educator, author. Crossing the Postmodern Divide, ch. 4, University of Chicago Press (1992)