2.3 CHANNEL CAPACITY
Shannon introduced the concept of channel capacity: the maximum rate at which data can be transmitted through a medium. The errors in the transmission medium depend on the energy of the signal, the energy of the noise, and the bandwidth of the channel. Conceptually, if the bandwidth is high, we can pump more data through the channel; if the signal energy is high, the effect of noise is reduced. According to Shannon, the channel bandwidth, signal energy, and noise energy are related by the formula

C = W log2(1 + S/N) bps
where
C is channel capacity in bits per second (bps)
W is bandwidth of the channel in Hz
S/N is the signal-to-noise power ratio (SNR). SNR is generally expressed in decibels (dB) using the formula

SNR(dB) = 10 log10(S/N)
The channel capacity obtained from this formula is a theoretical maximum. As an example, consider a voice-grade line for which W = 3100 Hz and SNR = 30 dB (i.e., a signal-to-noise ratio of 1000:1):

C = 3100 × log2(1 + 1000) ≈ 30,898 bps
So, we cannot transmit data over a voice-grade line at a rate faster than this value.
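The voice-grade calculation above can be sketched in a few lines of Python. The function name `shannon_capacity` is my own label, not from the source; it simply evaluates C = W log2(1 + S/N), converting the SNR from dB to a power ratio first.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Channel capacity C = W * log2(1 + S/N), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)  # 30 dB -> power ratio of 1000
    return bandwidth_hz * math.log2(1 + snr_linear)

# Voice-grade line from the example: W = 3100 Hz, SNR = 30 dB
c = shannon_capacity(3100, 30)
print(f"C = {c:.0f} bps")  # roughly 30,898 bps
```

Note that doubling the SNR in dB does not double C: the capacity grows only logarithmically with the signal-to-noise power ratio, which is why the discussion below turns to whether W or SNR can usefully be increased.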
An important point to note is that in the above formula Shannon assumes only thermal noise.
To increase C, can we increase W? No, because for thermal noise the noise power also grows with bandwidth, so increasing W reduces the SNR. To increase C, can we instead raise the signal power to improve the SNR? No, because driving the signal harder introduces additional noise, called intermodulation noise.
The entropy of an information source and the channel capacity are two important concepts on which Shannon based his theorems.
The bandwidth of the channel, signal energy, and noise energy are related by the formula C = W log2(1 + S/N) bps where C is the channel capacity, W is the bandwidth, and S/N is the signal-to-noise ratio.