Application of the Shannon-Hartley Law

In an additive white Gaussian noise (AWGN) channel, the channel output y is given by

y = x + n

where x is the channel input and n is additive band-limited white Gaussian noise with zero mean and variance \[\sigma^2\].

The capacity \[C_s\] per sample of an AWGN channel is given by

\[C_s = \max I(x;y) = \frac{1}{2}\log_{2}\left(1+\frac{S}{N}\right)\] in bits per sample

where \[\frac{S}{N}\] is the signal-to-noise ratio at the channel output.

If the channel bandwidth B is fixed, then the output y(t) is also band-limited and is completely characterised by its periodic sample values taken at the Nyquist rate of 2B samples per second. The capacity C (in b/s) of the AWGN channel is then given by

\[C = 2BC_s = B\log_{2}\left(1+\frac{S}{N}\right)\] bits per second

This is known as the Shannon-Hartley law.

Because capacity grows linearly with bandwidth but only logarithmically with signal power, it is easier to increase the information capacity of a continuous communication channel by expanding its bandwidth than by increasing the transmitted power for a prescribed noise variance.
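A minimal numerical sketch of this trade-off, using the Shannon-Hartley formula above (the 10 kHz bandwidth and 20 dB S/N figures are illustrative assumptions, not taken from a specific system):

```python
import math

def awgn_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 10e3    # channel bandwidth in Hz (illustrative)
snr = 100   # linear S/N, i.e. 20 dB (illustrative)

base       = awgn_capacity(B, snr)        # baseline capacity
double_bw  = awgn_capacity(2 * B, snr)    # double the bandwidth
double_pow = awgn_capacity(B, 2 * snr)    # double the signal power

# Doubling the bandwidth doubles the capacity, while doubling the
# signal power increases it only logarithmically.
```

Running this, doubling B exactly doubles C, whereas doubling S adds only about one extra bit per sample times B, illustrating the statement above.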

Problem. An analog signal of 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate, and each sample is quantised into one of 256 equally likely levels. Assume that successive samples are statistically independent.

(a) What is the information rate of the source?

(b) Can the output of the source be transmitted without error over an AWGN channel with a bandwidth of 10 kHz and an S/N ratio of 20 dB?

Solution: (a) Sampling frequency \[f_s = 1.25 \times 2f_m = 1.25 \times 8\ \text{kHz} = 10\ \text{kHz}\]

Each sample carries \[n = \log_{2}256 = 8\] bits, so the information rate is R = n\[f_s\] = 80 kb/s.

(b) An S/N of 20 dB corresponds to \[\frac{S}{N} = 10^{20/10} = 100\], so \[C = B\log_{2}\left(1+\frac{S}{N}\right) = 10^4\log_{2}(1+10^2) = 66.6\times 10^{3}\ \text{b/s}\]

Since the information rate (80 kb/s) exceeds the channel capacity (66.6 kb/s), error-free transmission is not possible.
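The arithmetic of the worked problem can be checked with a short script (a sketch of the same calculation, using only the figures given in the problem statement):

```python
import math

f_m = 4e3                    # message bandwidth: 4 kHz
f_s = 1.25 * (2 * f_m)       # 1.25 x Nyquist rate = 10 kHz
n = math.log2(256)           # bits per sample = 8
R = n * f_s                  # information rate = 80 kb/s

B = 10e3                     # channel bandwidth: 10 kHz
snr = 10 ** (20 / 10)        # 20 dB -> linear S/N of 100
C = B * math.log2(1 + snr)   # channel capacity, about 66.6 kb/s

# Error-free transmission requires R <= C, which fails here.
error_free_possible = R <= C
```

This confirms that R = 80 kb/s exceeds C ≈ 66.6 kb/s, so the source output cannot be transmitted without error over this channel.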
