Channel Capacity

In electrical engineering, computer science, and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communication channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the highest information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.

Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which one can compute it. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.
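For a discrete memoryless channel, this maximization can be carried out numerically with the Blahut–Arimoto algorithm, which alternates between updating the output distribution and reweighting the input distribution until the mutual information converges. The sketch below (assuming NumPy is available; the function name and tolerance are illustrative choices, not part of any standard API) computes the capacity of a binary symmetric channel and can be checked against the known closed form C = 1 - H2(p), where H2 is the binary entropy function.

```python
import numpy as np

def blahut_arimoto(P, tol=1e-9, max_iter=10_000):
    """Approximate the capacity (in bits) of a discrete memoryless channel.

    P[x, y] is the probability of receiving output y given input x.
    Starts from a uniform input distribution and iterates the
    Blahut-Arimoto update until the distribution stops changing.
    """
    m = P.shape[0]
    r = np.full(m, 1.0 / m)  # input distribution, initially uniform

    def divergences(r):
        # D[x] = KL(P[x, :] || q) in bits, where q is the output distribution
        q = r @ P
        ratio = np.where(P > 0, P / q, 1.0)
        return np.sum(np.where(P > 0, P * np.log2(ratio), 0.0), axis=1)

    for _ in range(max_iter):
        D = divergences(r)
        r_new = r * np.exp2(D)   # reweight inputs by their divergence
        r_new /= r_new.sum()
        if np.max(np.abs(r_new - r)) < tol:
            r = r_new
            break
        r = r_new

    # Capacity is the mutual information at the optimizing distribution
    return float(np.sum(r * divergences(r)))

# Binary symmetric channel with crossover probability p = 0.1
p = 0.1
P = np.array([[1 - p, p],
              [p, 1 - p]])
C = blahut_arimoto(P)
# Closed form for comparison: C = 1 - H2(p)
H2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
```

For a symmetric channel the uniform input distribution is already optimal, so the iteration converges immediately; the algorithm's value lies in channels without such symmetry, where the optimizing input distribution is not obvious.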
