Channel Capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete-time event index $i$. $Y_i$ is the sum of the input $X_i$ and noise $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Z_i \sim \mathcal{N}(0, N)$$
$$Y_i = X_i + Z_i$$

The $Z_i$ are further assumed not to be correlated with the $X_i$.
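As an illustration of the channel model (not part of the derivation), it can be simulated with Python's standard library. The values of $P$ and $N$ below are arbitrary choices; the simulation checks that the input and noise are uncorrelated and that the output power is close to $P + N$:

```python
import random

random.seed(0)
n_samples = 100_000
P = 1.0   # input power E[X^2] (arbitrary example value)
N = 0.5   # noise variance (arbitrary example value)

# Input symbols X_i and noise Z_i drawn independently from zero-mean Gaussians.
x = [random.gauss(0.0, P ** 0.5) for _ in range(n_samples)]
z = [random.gauss(0.0, N ** 0.5) for _ in range(n_samples)]

# Channel output: Y_i = X_i + Z_i
y = [xi + zi for xi, zi in zip(x, z)]

# Empirical checks: input and noise are (approximately) uncorrelated,
# and the output power is close to P + N.
mean = lambda v: sum(v) / len(v)
cov_xz = mean([xi * zi for xi, zi in zip(x, z)]) - mean(x) * mean(z)
power_y = mean([yi * yi for yi in y])
print(cov_xz)    # close to 0
print(power_y)   # close to P + N = 1.5
```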
The capacity of the channel is infinite unless the noise variance $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have:

$$\frac{1}{k}\sum_{i=1}^{k} x_i^2 \le P,$$
where $P$ represents the maximum channel power. Therefore, the channel capacity for the power-constrained channel is given by:

$$C = \max_{f(x)\,:\,E\left[X^2\right]\le P} I(X;Y)$$
where $f(x)$ is the distribution of $X$. Expand $I(X;Y)$, writing it in terms of the differential entropy:

$$I(X;Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X) = h(Y) - h(Z \mid X)$$
But $X$ and $Z$ are independent, so $h(Z \mid X) = h(Z)$, therefore:

$$I(X;Y) = h(Y) - h(Z)$$
Evaluating the differential entropy of a Gaussian gives:

$$h(Z) = \frac{1}{2}\log\left(2\pi e N\right)$$
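This closed form can be sanity-checked numerically: the differential entropy equals $E[-\ln f(Z)]$, so a Monte Carlo average of $-\ln f$ over noise samples should approach $\frac{1}{2}\ln(2\pi e N)$. A minimal sketch working in nats (the variance value and seed are arbitrary):

```python
import math
import random

random.seed(1)
N = 0.5                      # noise variance (arbitrary example value)
sigma = math.sqrt(N)

# Analytic differential entropy of a zero-mean Gaussian, in nats.
h_analytic = 0.5 * math.log(2 * math.pi * math.e * N)

# -ln f(z) for the Gaussian density f with variance N.
def neg_log_pdf(zi):
    return zi * zi / (2 * N) + math.log(sigma * math.sqrt(2 * math.pi))

# Monte Carlo estimate of h(Z) = E[-ln f(Z)].
samples = [random.gauss(0.0, sigma) for _ in range(100_000)]
h_estimate = sum(neg_log_pdf(zi) for zi in samples) / len(samples)

print(h_analytic, h_estimate)  # the two values agree closely
```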
Because $X$ and $Z$ are independent and their sum gives $Y$:

$$E\left[Y^2\right] = E\left[(X+Z)^2\right] = E\left[X^2\right] + 2E[X]E[Z] + E\left[Z^2\right] \le P + N$$
From this bound, we infer from a property of the differential entropy (among all distributions with a given variance, the Gaussian maximizes the differential entropy) that

$$h(Y) \le \frac{1}{2}\log\left(2\pi e(P+N)\right)$$
Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \le \frac{1}{2}\log\left(2\pi e(P+N)\right) - \frac{1}{2}\log\left(2\pi e N\right)$$
where $I(X;Y)$ is maximized when:

$$X \sim \mathcal{N}(0, P)$$
Thus the channel capacity $C$ for the AWGN channel is given by:

$$C = \frac{1}{2}\log\left(1 + \frac{P}{N}\right)$$
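Taking the logarithm base 2 expresses the capacity in bits per channel use. A minimal helper illustrating the formula (the example values of $P$ and $N$ are arbitrary):

```python
import math

def awgn_capacity(P, N):
    """Shannon capacity of the power-constrained AWGN channel,
    in bits per channel use: C = (1/2) * log2(1 + P/N)."""
    return 0.5 * math.log2(1 + P / N)

# When the signal power equals the noise variance (0 dB SNR),
# the capacity is exactly half a bit per channel use.
print(awgn_capacity(1.0, 1.0))   # 0.5

# Capacity grows only logarithmically as the SNR increases.
print(awgn_capacity(100.0, 1.0))
```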