
B. P. Lathi, Zhi Ding - Modern Digital and Analog Communication Systems-Oxford University Press (2009)


Problems

13.4-5 A cascade of two channels is shown in Fig. P13.4-5. The symbols at the source, at the output of the first channel, and at the output of the second channel are denoted by x, y, and z. Show that

H(x|z) ≥ H(x|y)

and

I(x; y) ≥ I(x; z)

This shows that the information that can be transmitted over a cascaded channel can be no greater

than that transmitted over one link. In effect, information channels tend to leak information.

Hint: For a cascaded channel, observe that

P(z|x, y) = P(z|y)

Hence, by Bayes' rule,

P(x|y, z) = P(x|y)

Figure P13.4-5: Cascade of two channels, with source symbol x, first-channel output y, and second-channel output z.
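The inequality I(x; y) ≥ I(x; z) can be spot-checked numerically. The sketch below assumes both links are binary symmetric channels (a choice made purely for illustration; the problem does not specify the channels):

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p_in, eps):
    """I(input; output) of a binary symmetric channel with crossover eps,
    for an input that is 1 with probability p_in."""
    p_out = p_in * (1 - eps) + (1 - p_in) * eps   # P(output = 1)
    return h2(p_out) - h2(eps)                    # I = H(y) - H(y|x)

# A cascade of two BSCs (crossovers eps1, eps2) is equivalent to a single
# BSC whose crossover is eps1(1 - eps2) + (1 - eps1)eps2.
p_in, eps1, eps2 = 0.5, 0.1, 0.1
eps_cascade = eps1 * (1 - eps2) + (1 - eps1) * eps2

I_xy = bsc_mutual_info(p_in, eps1)         # I(x; y), one link
I_xz = bsc_mutual_info(p_in, eps_cascade)  # I(x; z), two links

print(I_xy, I_xz)
assert I_xz <= I_xy  # the cascade leaks information
```

With these numbers I(x; y) ≈ 0.531 bit while I(x; z) ≈ 0.320 bit: the second link can only destroy information about x, never add it.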

13.5-1 For a continuous random variable x constrained to a peak magnitude M (-M < x < M), show that the entropy is maximum when x is uniformly distributed in the range (-M, M) and has zero probability density outside this range. Show that the maximum entropy is given by log 2M.
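The claim can be checked by numerical integration. The sketch below computes differential entropy (in nats) for the uniform density on (-M, M) and for a second density on the same support; the triangular density is an arbitrary comparison choice, not part of the problem:

```python
import math

def diff_entropy(pdf, a, b, n=100_000):
    """Differential entropy -∫ p ln p dx in nats, by the midpoint rule."""
    dx = (b - a) / n
    h = 0.0
    for i in range(n):
        p = pdf(a + (i + 0.5) * dx)
        if p > 0:
            h -= p * math.log(p) * dx
    return h

M = 3.0
uniform = lambda x: 1 / (2 * M)              # uniform on (-M, M)
triangular = lambda x: (M - abs(x)) / M**2   # another valid pdf on (-M, M)

h_uniform = diff_entropy(uniform, -M, M)
h_tri = diff_entropy(triangular, -M, M)

print(h_uniform, math.log(2 * M))  # the uniform entropy matches ln 2M
assert h_tri < h_uniform           # any other density on (-M, M) does worse
```

Here h_uniform ≈ ln 6 ≈ 1.792 nats, while the triangular density gives only ≈ 1.599 nats, consistent with the uniform density being the maximizer.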

13.5-2 For a continuous random variable x constrained to only positive values (0 < x < ∞) and a mean value A, show that the entropy is maximum when

p(x) = (1/A) e^(-x/A)

Show that the corresponding entropy is

H(x) = log eA
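As a numerical check: for positive support and fixed mean A, the maximizing density is the exponential p(x) = (1/A) e^(-x/A), and integrating -p ln p should reproduce log eA = 1 + ln A (in nats). A minimal sketch:

```python
import math

A = 2.0  # the mean value from the problem

def pdf(x):
    """Exponential density with mean A."""
    return math.exp(-x / A) / A

# Integrate -p ln p over (0, ∞), truncating deep in the tail where the
# density is negligible.
n, upper = 200_000, 50 * A
dx = upper / n
h = 0.0
for i in range(n):
    p = pdf((i + 0.5) * dx)
    h -= p * math.log(p) * dx

print(h, math.log(math.e * A))  # both ≈ 1 + ln A (entropy in nats)
```

With A = 2 both values come out near 1.693 nats, matching H(x) = log eA.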

13.5-3 A television transmission requires 30 frames of 300,000 picture elements each to be transmitted

per second. Use the data in Prob. 13.1-2 to estimate the theoretical bandwidth of the AWGN

channel if the SNR at the receiver is required to be at least 50 dB.
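One way to set up the estimate is to compute the source bit rate, then invert the Shannon capacity formula C = B log2(1 + SNR) for the bandwidth B. The sketch below assumes, as a hypothetical stand-in for Prob. 13.1-2 (not reproduced here), that each picture element takes one of 10 equally likely brightness levels:

```python
import math

# Hypothetical assumption standing in for Prob. 13.1-2: each picture
# element takes one of 10 equally likely brightness levels.
levels = 10
bits_per_element = math.log2(levels)        # ≈ 3.32 bits per element

rate = 30 * 300_000 * bits_per_element      # required bit rate, bits/s

snr_db = 50
snr = 10 ** (snr_db / 10)                   # 50 dB → SNR = 10^5

# Shannon capacity C = B log2(1 + SNR); the channel supports the source
# only if C ≥ rate, so the minimum bandwidth is:
B = rate / math.log2(1 + snr)
print(f"{B / 1e6:.2f} MHz")                 # ≈ 1.80 MHz under these assumptions
```

The resulting bandwidth scales directly with the assumed bits per element, so a different answer to Prob. 13.1-2 changes B proportionally.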

13.7-1 In a communication system over a frequency-selective channel with transfer function

H(f) = 1 + jπ(f/200)

and a given input signal PSD, the channel noise is AWGN with spectrum Sn(f) = 10^(-2). Find the mutual information between the channel input and the channel output.
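For a frequency-selective Gaussian channel, the mutual information rate is obtained by integrating the per-frequency capacity density, I = ∫ log2(1 + |H(f)|² Sx(f)/Sn(f)) df. The sketch below uses |H(f)|² for H(f) = 1 + jπ(f/200) and the noise spectrum from the problem, but the flat input PSD and the |f| ≤ 100 Hz band are hypothetical stand-ins; the problem's actual PSD should be substituted:

```python
import math

def mutual_info_rate(Sx, H2, Sn, f_lo, f_hi, n=20_000):
    """I = ∫ log2(1 + |H(f)|^2 Sx(f) / Sn(f)) df, in bits per second."""
    df = (f_hi - f_lo) / n
    total = 0.0
    for i in range(n):
        f = f_lo + (i + 0.5) * df
        total += math.log2(1 + H2(f) * Sx(f) / Sn(f)) * df
    return total

H2 = lambda f: 1 + (math.pi * f / 200) ** 2  # |H(f)|^2 for H(f) = 1 + jπ(f/200)
Sn = lambda f: 1e-2                          # AWGN spectrum from the problem
Sx = lambda f: 1.0                           # hypothetical flat input PSD

# Integrate over a hypothetical band |f| <= 100 Hz.
I = mutual_info_rate(Sx, H2, Sn, -100.0, 100.0)
print(I, "bits/s")
```

Because |H(f)|² grows with frequency here, the integrand (and hence the information rate) is dominated by the band edges rather than by f = 0.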
