
13.5 CHANNEL CAPACITY OF A CONTINUOUS MEMORYLESS CHANNEL

For a discrete random variable x taking on values x1, x2, ..., xn with probabilities P(x1), P(x2), ..., P(xn), the entropy H(x) was defined as

H(x) = \sum_{i=1}^{n} P(x_i) \log \frac{1}{P(x_i)}    (13.29)
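
As a quick numerical illustration of Eq. (13.29) (a sketch, not from the text; the four-symbol probability vector is an arbitrary example), the entropy of a discrete source can be computed directly:

```python
import numpy as np

# Discrete entropy per Eq. (13.29), using log base 2 so H is in bits.
# The probability vector below is a made-up example, not from the text.
P = np.array([0.5, 0.25, 0.125, 0.125])   # must sum to 1

H = np.sum(P * np.log2(1.0 / P))          # H(x) = sum_i P(x_i) log2[1/P(x_i)]
print(f"H(x) = {H} bits/symbol")          # -> 1.75 bits/symbol
```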

For analog data, we have to deal with continuous random variables. Therefore, we must extend the definition of entropy to continuous random variables. One is tempted to state that H(x) for continuous random variables is obtained by using the integral instead of the discrete summation in Eq. (13.29)*:

H(x) = \int_{-\infty}^{\infty} p(x) \log \frac{1}{p(x)} \, dx    (13.30)
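
As a rough numerical check of Eq. (13.30) (a sketch, not from the text; the Gaussian PDF and its standard closed-form entropy 0.5 log2(2πeσ²) are used here only as an assumed test case):

```python
import numpy as np

# Evaluate the integral in Eq. (13.30) numerically for a Gaussian PDF and
# compare with the known closed form h = 0.5*log2(2*pi*e*sigma^2).
sigma = 2.0
x = np.linspace(-12 * sigma, 12 * sigma, 200_001)
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

h_numeric = np.trapz(p * np.log2(1.0 / p), x)      # integral of p(x) log2[1/p(x)]
h_closed = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_closed)                          # both approximately 3.047 bits
```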

We shall see that Eq. (13.30) is indeed the meaningful definition of entropy for a continuous random variable. We cannot accept this definition, however, unless we show that it has the meaningful interpretation as uncertainty. A random variable x takes a value in the range (nΔx, (n + 1)Δx) with probability p(nΔx)Δx in the limit as Δx → 0. The error in the approximation will vanish in the limit as Δx → 0. Hence H(x), the entropy of a continuous random variable x, is given by

H(x) = \lim_{\Delta x \to 0} \sum_n p(n\Delta x)\,\Delta x \, \log \frac{1}{p(n\Delta x)\,\Delta x}

     = \lim_{\Delta x \to 0} \left[ \sum_n p(n\Delta x)\,\Delta x \, \log \frac{1}{p(n\Delta x)} \; - \; \sum_n p(n\Delta x)\,\Delta x \, \log \Delta x \right]

     = \int_{-\infty}^{\infty} p(x) \log \frac{1}{p(x)} \, dx \; - \; \lim_{\Delta x \to 0} \log \Delta x \int_{-\infty}^{\infty} p(x) \, dx

     = \int_{-\infty}^{\infty} p(x) \log \frac{1}{p(x)} \, dx \; - \; \lim_{\Delta x \to 0} \log \Delta x    (13.31)

where the last step follows because \int_{-\infty}^{\infty} p(x)\,dx = 1.
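
The limiting behavior in Eq. (13.31) can also be seen numerically. The sketch below (illustrative only, using an assumed unit-variance Gaussian p(x)) quantizes x into bins of width Δx and shows that the entropy of the quantized variable tracks the first term of Eq. (13.31) minus log2 Δx, and therefore grows without bound as Δx → 0:

```python
import numpy as np

# Entropy of a Gaussian variable quantized to bins of width dx. It should
# track h - log2(dx), where h = 0.5*log2(2*pi*e) is the first (finite) term
# of Eq. (13.31); the -log2(dx) datum diverges as dx -> 0.
sigma = 1.0
h = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)

for dx in (1.0, 0.1, 0.01, 0.001):
    edges = np.arange(-8 * sigma, 8 * sigma + dx, dx)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p_n = dx * np.exp(-centers**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    p_n = p_n / p_n.sum()                     # P(nth bin) ~ p(n*dx)*dx, normalized
    H_q = np.sum(p_n * np.log2(1.0 / p_n))
    print(f"dx = {dx:6.3f}:  H = {H_q:7.3f} bits,  h - log2(dx) = {h - np.log2(dx):7.3f} bits")
```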

In the limit as Δx → 0, log Δx → −∞. It therefore appears that the entropy of a continuous random variable is infinite. This is quite true. The magnitude of uncertainty associated with a continuous random variable is infinite. This fact is also apparent intuitively. A continuous random variable assumes an uncountably infinite number of values, and, hence, the uncertainty is on the order of infinity. Does this mean that there is no meaningful definition of entropy for a continuous random variable? On the contrary, we shall see that the first term in Eq. (13.31) serves as a meaningful measure of the entropy (average information) of a continuous random variable x. This may be argued as follows. We can consider ∫ p(x) log [1/p(x)] dx as a relative entropy with −log Δx serving as a datum, or reference. The information transmitted over a channel is actually the difference between the two terms H(x) and H(x|y). Obviously, if we have a common datum for both H(x) and H(x|y), the difference H(x) − H(x|y) will be the same

* Throughout this discussion, the PDF p_x(x) will be abbreviated as p(x); this practice causes no ambiguity and improves the clarity of the equations.
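
The datum argument above can be checked by a rough simulation (a sketch with assumed parameters, not from the text): for a channel y = x + n with Gaussian input and Gaussian noise, the entropies of the quantized input and output diverge as the bin width Δx shrinks, but the difference H(x) − H(x|y) = H(x) + H(y) − H(x, y) settles near the finite value 0.5 log2(1 + σx²/σn²) ≈ 1.16 bits:

```python
import numpy as np

# Quantize x and y = x + n to bins of width dx. The individual entropies grow
# like -log2(dx), but their shared datum cancels in H(x) - H(x|y).
rng = np.random.default_rng(0)
N = 1_000_000
x = rng.normal(0.0, 1.0, N)            # channel input, sigma_x = 1
y = x + rng.normal(0.0, 0.5, N)        # output with independent noise, sigma_n = 0.5

def entropy_bits(labels):
    """Plug-in entropy estimate (bits) of an array of discrete bin labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(np.sum(p * np.log2(1.0 / p)))

for dx in (0.5, 0.2, 0.1):
    bx = np.floor(x / dx).astype(np.int64)
    by = np.floor(y / dx).astype(np.int64)
    Hx, Hy = entropy_bits(bx), entropy_bits(by)
    Hxy = entropy_bits(bx * 1_000_000 + by)    # joint labels (no collisions here)
    print(f"dx = {dx:4.1f}:  H(x) = {Hx:5.2f} bits (grows),  "
          f"H(x) - H(x|y) = {Hx + Hy - Hxy:5.3f} bits")
```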
