
B. P. Lathi and Zhi Ding, Modern Digital and Analog Communication Systems, Oxford University Press, 2009


INTRODUCTION TO INFORMATION THEORY

knowledge of one sample gives no information about any other sample. Hence, the entropy of the 2B Nyquist samples is the sum of the entropies of the 2B samples, and

H'(n) = B log (2πe𝒩B) bit/s    (13.43b)

where H'(n) is the entropy per second of n(t).
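To make the step behind Eq. (13.43b) explicit (assuming, as is standard for this result, white Gaussian noise of power spectral density 𝒩/2 band-limited to B Hz): each Nyquist sample of n(t) is then a Gaussian random variable of mean square value 𝒩B, so its entropy is (1/2) log (2πe𝒩B) bit, and taking 2B such independent samples per second gives

H'(n) = 2B · (1/2) log (2πe𝒩B) = B log (2πe𝒩B) bit/s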

From the results derived thus far, we can draw one significant conclusion. Among all signals band-limited to B Hz and constrained to have a certain mean square value σ², the white Gaussian band-limited signal has the largest entropy per second. To understand the reason for this, recall that for a given mean square value, Gaussian samples have the largest entropy; moreover, all the 2B samples of a Gaussian band-limited process are independent. Hence, the entropy per second is the sum of the entropies of all the 2B samples. In processes that are not white, the Nyquist samples are correlated, and, hence, the entropy per second is less than the sum of the entropies of the 2B samples. If the signal is not Gaussian, then its samples are not Gaussian, and, hence, the entropy per sample is also less than the maximum possible entropy for a given mean square value. To reiterate, for a class of band-limited signals constrained to a certain mean square value, the white Gaussian signal has the largest entropy per second, or the largest amount of uncertainty. This is also the reason why white Gaussian noise is the worst possible noise in terms of interference with signal transmission.
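As a quick numerical illustration of the two facts used above (for a fixed mean square value the Gaussian sample has the largest entropy, and the entropy per second follows by counting 2B independent samples per second), here is a minimal Python sketch; the variance, the comparison with a uniform density, and the bandwidth value are illustration choices, not taken from the text.

```python
import numpy as np

# Closed-form differential entropies (in bits) of two zero-mean densities with
# the same mean square value sigma2, illustrating that the Gaussian sample has
# the larger entropy.  All numbers below are arbitrary illustration values.

sigma2 = 2.0                                   # common mean square value

# Gaussian: h = (1/2) log2(2*pi*e*sigma^2)
h_gauss = 0.5 * np.log2(2 * np.pi * np.e * sigma2)

# Uniform on [-a, a] with the same variance: a^2/3 = sigma2  =>  a = sqrt(3*sigma2)
a = np.sqrt(3 * sigma2)
h_unif = np.log2(2 * a)                        # h = log2(2a)

print(f"Gaussian sample entropy: {h_gauss:.3f} bit")
print(f"Uniform sample entropy : {h_unif:.3f} bit")   # about 0.25 bit smaller

# Entropy per second of band-limited white Gaussian noise, Eq. (13.43b):
# 2B independent samples per second, each of entropy (1/2) log2(2*pi*e*sigma2),
# where sigma2 plays the role of the noise power 𝒩B in the band of width B.
B = 4000.0                                     # bandwidth in Hz (arbitrary)
H_rate = B * np.log2(2 * np.pi * np.e * sigma2)
print(f"Entropy per second     : {H_rate:.0f} bit/s")
```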

Mutual Information I(x; y)

The ultimate test of any concept is its usefulness. We shall now show that the relative entropy defined in Eqs. (13.32) does lead to meaningful results when we consider I(x; y), the mutual information of continuous random variables x and y. We wish to transmit a random variable x over a channel. Each value of x in a given continuous range is now a message that may be transmitted, for example, as a pulse of height x. The message recovered by the receiver will be a continuous random variable y. If the channel were noise free, the received value y would uniquely determine the transmitted value x. But channel noise introduces a certain uncertainty about the true value of x. Consider the event that at the transmitter, a value of x in the interval (x, x + Δx) has been transmitted (Δx → 0). The probability of this event is p(x)Δx in the limit Δx → 0. Hence, the amount of information transmitted is log [1/(p(x)Δx)]. Let the value of y at the receiver be y, and let p(x|y) be the conditional probability density of x when y = y. Then p(x|y)Δx is the probability that x will lie in the interval (x, x + Δx) when y = y (provided Δx → 0). Obviously, there is an uncertainty about the event that x lies in the interval (x, x + Δx). This uncertainty, log [1/(p(x|y)Δx)], arises because of channel noise and therefore represents a loss of information. Because log [1/(p(x)Δx)] is the information transmitted and log [1/(p(x|y)Δx)] is the information lost over the channel, the net information received is I(x; y), given by

I(x; y) = log [1/(p(x)Δx)] − log [1/(p(x|y)Δx)]
        = log [p(x|y)/p(x)]    (13.44)
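To see Eq. (13.44) at work, the following minimal Monte Carlo sketch (a hypothetical example, not the text's) assumes an additive Gaussian noise channel y = x + n with a Gaussian input of power S and noise of power N, averages log [p(x|y)/p(x)] over many transmissions, and compares the result with the familiar (1/2) log2(1 + S/N); this averaging step is exactly what the next paragraph introduces.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo average of Eq. (13.44), log[p(x|y)/p(x)], for a Gaussian channel
# y = x + n.  S and N are arbitrary illustration values (signal and noise power).
S, N = 4.0, 1.0
n_samples = 200_000

x = rng.normal(0.0, np.sqrt(S), n_samples)
y = x + rng.normal(0.0, np.sqrt(N), n_samples)

def log2_gauss_pdf(z, var):
    """log2 of a zero-mean Gaussian density of variance var, evaluated at z."""
    return -0.5 * np.log2(2 * np.pi * var) - z**2 / (2 * var * np.log(2))

# For jointly Gaussian x and y = x + n, the conditional density p(x|y) is
# Gaussian with mean S/(S+N)*y and variance S*N/(S+N).
cond_var = S * N / (S + N)
log_p_x_given_y = log2_gauss_pdf(x - (S / (S + N)) * y, cond_var)
log_p_x = log2_gauss_pdf(x, S)

I_avg = np.mean(log_p_x_given_y - log_p_x)     # average of Eq. (13.44), in bits
I_closed_form = 0.5 * np.log2(1 + S / N)       # (1/2) log2(1 + S/N)

print(f"Monte Carlo average of Eq. (13.44): {I_avg:.3f} bit")
print(f"(1/2) log2(1 + S/N)               : {I_closed_form:.3f} bit")
```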

Note that Eq. (13.44) is true in the limit Δx → 0. Therefore, I(x; y) represents the information transmitted over the channel if we receive y (y = y) when x is transmitted (x = x). We are interested in finding the average information transmitted over the channel when some x is transmitted and a certain y is received. We must therefore average I(x; y) over all values of x and y.
