Shannon's Source Coding Theorem

In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits of possible data compression and gives an operational meaning to the Shannon entropy.

The source coding theorem shows that, in the limit as the length of a stream of independent and identically distributed (i.i.d.) data tends to infinity, it is impossible to compress the data such that the code rate (the average number of bits per symbol) is less than the Shannon entropy of the source without it being virtually certain that information will be lost. However, it is possible to get the code rate arbitrarily close to the Shannon entropy with negligible probability of loss.
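
The following minimal Python sketch illustrates the bound. The biased binary source and its probabilities are assumptions chosen for this example, not taken from the theorem itself; the point is only that H(X) fixes the best achievable bits-per-symbol rate for lossless compression.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example source (assumed for illustration): a biased binary source
# with P(0) = 0.9 and P(1) = 0.1.
probs = [0.9, 0.1]
H = shannon_entropy(probs)
print(f"H(X) = {H:.4f} bits/symbol")  # ~0.4690

# Source coding theorem: for n i.i.d. symbols, any lossless code needs
# about n * H(X) bits as n grows, and about n * H(X) bits also suffice.
n = 1_000_000
print(f"~{n * H:,.0f} bits achievable for {n:,} symbols "
      f"(vs. {n:,} bits uncompressed)")
```

For this source H(X) ≈ 0.469 bits per symbol, so a million-symbol stream can be compressed to roughly 469,000 bits with negligible probability of loss, but no lossless scheme can do asymptotically better.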

The source coding theorem for symbol codes places an upper and a lower bound on the minimal possible expected length of codewords as a function of the entropy of the input word (which is viewed as a random variable) and of the size of the target alphabet.
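
Concretely, for a binary code alphabet the bounds read H(X) ≤ E[L] < H(X) + 1, where E[L] is the minimal expected codeword length; for a target alphabet of size a, H(X) is divided by log2(a). The sketch below, an illustration rather than the theorem's proof, checks the bounds numerically using a Huffman code (one example of an optimal binary symbol code) on an arbitrarily chosen distribution.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary (Huffman) symbol code."""
    # Heap entries: (probability, unique tiebreaker, symbols in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    next_id = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:  # each symbol under the merge gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, next_id, syms1 + syms2))
        next_id += 1
    return lengths

# Distribution assumed for illustration only.
probs = [0.5, 0.25, 0.15, 0.1]
H = -sum(p * math.log2(p) for p in probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(X) = {H:.4f} bits, expected codeword length = {L:.4f} bits")
assert H <= L < H + 1  # the symbol-code bounds
```

Here H(X) ≈ 1.743 bits while the Huffman code achieves an expected length of 1.75 bits, landing inside the interval [H(X), H(X) + 1) as the theorem guarantees.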
