Bit - Information Capacity and Information Compression

When the information capacity of a storage system or a communication channel is presented in bits or bits per second, this often refers to binary digits, that is, the capacity of computer hardware to store binary code (0 or 1, up or down, current or no current, and so on). The information capacity of a storage system is only an upper bound on the actual quantity of information stored in it. If the two possible values of one bit of storage are not equally likely, that bit of storage contains less than one bit of information. Indeed, if the value is completely predictable, then reading that value provides no information at all (zero entropic bits, because no uncertainty is resolved and therefore no information is conveyed). If a computer file that uses n bits of storage contains only m < n bits of information, then that information can in principle be encoded in about m bits, at least on average. This principle is the basis of data compression technology.

By analogy, the hardware binary digits refer to the amount of storage space available (like the number of buckets available to store things), while the information content is the filling, which comes in different levels of granularity (fine or coarse, that is, compressed or uncompressed information). When the granularity is finer (when information is more compressed), the same bucket can hold more.
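As a rough illustration (a minimal Python sketch, not part of the original text), the binary entropy function quantifies how much information a single stored bit carries when its two values are not equally likely, and hence how far a file of such bits could in principle be compressed:

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy, in bits, of one stored bit whose value is 1
    with probability p and 0 with probability 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # completely predictable value: zero bits of information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0   -> a fair bit carries a full bit of information
print(binary_entropy(0.9))  # ~0.47 -> a biased bit carries less
print(binary_entropy(1.0))  # 0.0   -> a predictable bit carries none

# n stored bits with bias p hold only about n * H(p) bits of information,
# so they can in principle be encoded in roughly that many bits.
n, p = 1_000_000, 0.9
print(f"{n} stored bits at p={p} hold about {n * binary_entropy(p):,.0f} bits of information")
```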

For example, it is estimated that the world's combined technological capacity to store information provided 1,300 exabytes of hardware digits in 2007. However, when this storage space is filled and the corresponding content is optimally compressed, it represents only 295 exabytes of information. When optimally compressed, the resulting carrying capacity approaches the Shannon information, or information entropy.
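As a back-of-the-envelope check (not from the original text, using only the two figures quoted above), the implied ratio of information to raw storage is roughly 0.23:

```python
# Both figures are taken from the source text (2007 estimates).
hardware_exabytes = 1300     # raw storage capacity in hardware binary digits
information_exabytes = 295   # same content, optimally compressed

ratio = information_exabytes / hardware_exabytes
print(f"compressed / raw = {ratio:.2f}")
# ~0.23: the optimally compressed content occupies less than a quarter
# of the raw hardware capacity used to store it.
```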
