Channel Capacity
The AWGN channel is represented by a series of outputs $Y_i$ at discrete time index $i$. $Y_i$ is the sum of the input $X_i$ and noise $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $N$ (the noise):

$$Y_i = X_i + Z_i, \qquad Z_i \sim \mathcal{N}(0, N).$$

The $Z_i$ are further assumed not to be correlated with the $X_i$.
The capacity of the channel is infinite unless the noise variance $N$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have:

$$\frac{1}{k} \sum_{i=1}^{k} x_i^2 \le P,$$
where $P$ represents the maximum channel power. Therefore, the channel capacity for the power-constrained channel is given by:

$$C = \max_{f(x)\,:\,E[X^2] \le P} I(X; Y),$$
where $f(x)$ is the distribution of $X$. Expanding $I(X; Y)$, writing it in terms of the differential entropy:

$$I(X; Y) = h(Y) - h(Y \mid X) = h(Y) - h(X + Z \mid X) = h(Y) - h(Z \mid X).$$
But $X$ and $Z$ are independent, therefore:

$$I(X; Y) = h(Y) - h(Z).$$
Evaluating the differential entropy of a Gaussian gives:

$$h(Z) = \frac{1}{2} \log\bigl(2 \pi e N\bigr).$$
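The closed form for $h(Z)$ can be sanity-checked numerically: since $h(Z) = -E[\log f(Z)]$, a Monte Carlo average of $-\log f(z)$ over samples $z \sim \mathcal{N}(0, N)$ should converge to $\tfrac{1}{2}\log(2\pi e N)$. A minimal sketch in Python (the variance value and sample count are illustrative choices, not from the source):

```python
import math
import random

random.seed(0)
N = 2.0                      # illustrative noise variance
sigma = math.sqrt(N)

def log_pdf(z):
    # log-density of a zero-mean Gaussian with variance N
    return -0.5 * math.log(2 * math.pi * N) - z * z / (2 * N)

# Monte Carlo estimate of h(Z) = -E[log f(Z)], in nats
samples = [random.gauss(0.0, sigma) for _ in range(200_000)]
h_mc = -sum(log_pdf(z) for z in samples) / len(samples)

# Closed form: (1/2) log(2*pi*e*N)
h_closed = 0.5 * math.log(2 * math.pi * math.e * N)
print(h_mc, h_closed)  # the two agree to within Monte Carlo error
```

With natural logarithms the entropy is in nats; dividing by $\log 2$ converts to bits.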
Because $X$ and $Z$ are independent and their sum gives $Y$ (the cross term vanishes since $E[Z] = 0$):

$$E[Y^2] = E[(X + Z)^2] = E[X^2] + 2E[X]E[Z] + E[Z^2] \le P + N.$$
From this bound, we infer from a property of the differential entropy (the Gaussian maximizes entropy among all distributions with a given second moment) that

$$h(Y) \le \frac{1}{2} \log\bigl(2 \pi e (P + N)\bigr).$$
Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X; Y) \le \frac{1}{2} \log\bigl(2 \pi e (P + N)\bigr) - \frac{1}{2} \log\bigl(2 \pi e N\bigr),$$
where $I(X; Y)$ is maximized when:

$$X \sim \mathcal{N}(0, P).$$
Thus the channel capacity for the AWGN channel is given by:

$$C = \frac{1}{2} \log\Bigl(1 + \frac{P}{N}\Bigr).$$
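The final formula is easy to evaluate directly. A short sketch (with a base-2 logarithm, so capacity is in bits per channel use; the power and noise values below are illustrative, not from the source):

```python
import math

def awgn_capacity(P, N):
    """Capacity in bits per channel use of a real AWGN channel
    with power constraint P and noise variance N."""
    return 0.5 * math.log2(1 + P / N)

# Illustrative values: capacity grows only logarithmically in power,
# so doubling the power adds at most half a bit per channel use.
c1 = awgn_capacity(100, 1)   # ~3.33 bits per channel use
c2 = awgn_capacity(200, 1)   # ~3.83 bits per channel use
print(c1, c2)
```

Note the factor $\tfrac{1}{2}$: it reflects one real (not complex) channel use per symbol.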