
13.5 Channel Capacity of a Continuous Memoryless Channel

as the difference between their relative entropies. We are therefore justified in considering the first term in Eq. (13.31) as the differential entropy of x. We must, however, always remember that this is a relative entropy and not the absolute entropy. Failure to realize this subtle point generates many apparent fallacies, one of which will be given in Example 13.4.

Based on this argument, we define H(x), the differential entropy of a continuous random variable x, as

$$
\begin{aligned}
H(x) &= \int_{-\infty}^{\infty} p(x)\,\log\frac{1}{p(x)}\,dx \quad \text{bits} \qquad &(13.32\text{a})\\[4pt]
     &= -\int_{-\infty}^{\infty} p(x)\,\log p(x)\,dx \quad \text{bits} \qquad &(13.32\text{b})
\end{aligned}
$$

Although H(x) is the differential (relative) entropy of x, we shall call it the entropy of random variable x for brevity.
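As a quick numerical check of Eq. (13.32), the integral can be approximated on a grid. The Python sketch below is ours, not the book's; the function name, grid size, and test densities are illustrative choices. It also illustrates the point made above: being a relative quantity, the differential entropy can come out zero or even negative.

```python
import numpy as np

def differential_entropy_bits(pdf, lo, hi, n=1_000_000):
    """Midpoint-rule approximation of H(x) = -integral of p(x) log2 p(x) dx over [lo, hi]."""
    dx = (hi - lo) / n
    x = lo + (np.arange(n) + 0.5) * dx     # midpoints of n equal cells
    p = pdf(x)
    p = p[p > 0]                           # p*log p -> 0 wherever p = 0
    return float(np.sum(-p * np.log2(p)) * dx)

# Uniform on (0, 1): p(x) = 1, so H(x) = 0 bits
print(differential_entropy_bits(lambda x: np.ones_like(x), 0.0, 1.0))       # -> 0.0

# Uniform on (0, 1/2): p(x) = 2, so H(x) = -1 bit (a negative differential entropy)
print(differential_entropy_bits(lambda x: np.full_like(x, 2.0), 0.0, 0.5))  # -> -1.0
```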

Example 13.4  A signal amplitude x is a random variable uniformly distributed in the range (−1, 1). This signal is passed through an amplifier of gain 2. The output y is also a random variable, uniformly distributed in the range (−2, 2). Determine the (differential) entropies H(x) and H(y).

We have

$$
p(x) = \begin{cases} \dfrac{1}{2} & |x| < 1 \\ 0 & \text{otherwise} \end{cases}
\qquad
p(y) = \begin{cases} \dfrac{1}{4} & |y| < 2 \\ 0 & \text{otherwise} \end{cases}
$$

Hence,

$$
H(x) = \int_{-1}^{1} \frac{1}{2}\,\log 2\,dx = 1 \text{ bit}
$$

$$
H(y) = \int_{-2}^{2} \frac{1}{4}\,\log 4\,dx = 2 \text{ bits}
$$
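More generally (our remark, not the book's), for a random variable uniform on (−a, a) the density is 1/2a over an interval of length 2a, so

$$
H = \int_{-a}^{a} \frac{1}{2a}\,\log(2a)\,dx = \log(2a) \ \text{bits},
$$

which gives 1 bit for a = 1 and 2 bits for a = 2, matching the two integrals above.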

The entropy of the random variable y is 1 bit higher than that of x. This result may come as a surprise, since a knowledge of x uniquely determines y, and vice versa, because y = 2x. Hence, the average uncertainty of x and y should be identical. Amplification itself can neither add nor subtract information. Why, then, is H(y) twice as large as H(x)? This becomes clear when we remember that H(x) and H(y) are differential (relative) entropies, and they will be equal if and only if their datum (or reference) entropies are equal. The reference entropy R1 for x is −log Δx, and the reference entropy R2 for y is −log Δy.
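A sketch of how the two reference entropies reconcile, under the assumption (ours, for illustration) that the x axis is quantized into cells of width Δx which the gain-2 amplifier maps onto cells of width Δy = 2Δx:

$$
R_2 = -\log \Delta y = -\log(2\,\Delta x) = R_1 - 1 \ \text{bit},
$$

so that

$$
H(y) + R_2 = 2 + (R_1 - 1) = 1 + R_1 = H(x) + R_1 .
$$

The extra bit in the differential entropy H(y) is exactly offset by the 1-bit-lower reference entropy, so the absolute (reference-plus-differential) entropies of x and y agree, as the argument above requires.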
