B. P. Lathi, Zhi Ding, Modern Digital and Analog Communication Systems, Oxford University Press (2009)

13.4 Channel Capacity of a Discrete Memoryless Channel

The quantity H(y|x) is the conditional entropy of y given x and is the average uncertainty about the received symbol when the transmitted symbol is known. Equation (13.21b) can be rewritten as

H(x) - H(x|y) = H(y) - H(y|x)    (13.21c)
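Identity (13.21c) can be checked numerically using the chain rule H(x, y) = H(y) + H(x|y). The sketch below is illustrative only (the 3-by-4 joint distribution and the random seed are arbitrary choices, not from the text); both sides of the identity equal the mutual information I(x; y).

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability array (joint or marginal)."""
    p = p[p > 0]                       # 0 log 0 terms contribute nothing
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
p_xy = rng.random((3, 4))
p_xy /= p_xy.sum()                     # an arbitrary joint P(x, y)

p_x = p_xy.sum(axis=1)                 # marginal P(x)
p_y = p_xy.sum(axis=0)                 # marginal P(y)

# Chain rule: H(x, y) = H(y) + H(x|y) = H(x) + H(y|x)
H_x_given_y = H(p_xy) - H(p_y)
H_y_given_x = H(p_xy) - H(p_x)

lhs = H(p_x) - H_x_given_y             # left side of (13.21c)
rhs = H(p_y) - H_y_given_x             # right side of (13.21c)
print(lhs, rhs)                        # both equal I(x; y)
```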

From Eq. (13.20d) it is clear that I(x; y) is a function of the transmitted symbol probabilities P(x_i) and the channel matrix. For a given channel, I(x; y) will be maximum for some set of probabilities P(x_i). This maximum value is the channel capacity C_s,

C_s = max_{P(x_i)} I(x; y)    bits per symbol    (13.22)

Thus, because we have allowed the channel input to choose any symbol probabilities P(x_i), C_s represents the maximum information that can be transmitted by one symbol over the channel. These ideas will become clear from the following example of a binary symmetric channel (BSC).
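The maximization in Eq. (13.22) can be carried out numerically. Below is a minimal sketch, not from the text: the function names are illustrative, and for simplicity the search is a grid over P(x1) = alpha, which suffices for a two-input channel such as the BSC.

```python
import numpy as np

def mutual_information(p_x, P):
    """I(x; y) in bits for input distribution p_x and channel matrix P,
    where P[i, j] = P(y_j | x_i), as in Eq. (13.20d)."""
    p_xy = p_x[:, None] * P            # joint P(x_i, y_j)
    p_y = p_xy.sum(axis=0)             # output distribution P(y_j)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = p_xy * np.log2(p_xy / (p_x[:, None] * p_y[None, :]))
    return float(np.nansum(terms))     # 0 log 0 terms contribute nothing

def capacity_two_input(P, n=10001):
    """C_s for a two-input channel: maximize I(x; y) over P(x1) = alpha
    by grid search (Eq. 13.22)."""
    alphas = np.linspace(0.0, 1.0, n)
    return max(mutual_information(np.array([a, 1.0 - a]), P) for a in alphas)

# Example: BSC with Pe = 0.1; rows of the channel matrix are P(y | x)
bsc = np.array([[0.9, 0.1],
                [0.1, 0.9]])
print(capacity_two_input(bsc))         # approx 0.531 bit per symbol
```

For channels with more than two inputs the grid search generalizes poorly; the standard tool is the Blahut-Arimoto algorithm, but the simple search above is enough to illustrate Eq. (13.22).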

Example 13.3  Find the channel capacity of the BSC shown in Fig. 13.2.

[Figure 13.2: Binary symmetric channel.]

Let P(x1) = α and P(x2) = ᾱ = (1 - α). Also,

P(y1|x2) = P(y2|x1) = P_e
P(y1|x1) = P(y2|x2) = P̄_e = 1 - P_e

Substitution of these probabilities into Eq. (13.20d) gives

I(x; y) = α P̄_e log [P̄_e / (α P̄_e + ᾱ P_e)] + α P_e log [P_e / (α P_e + ᾱ P̄_e)]
          + ᾱ P_e log [P_e / (α P̄_e + ᾱ P_e)] + ᾱ P̄_e log [P̄_e / (α P_e + ᾱ P̄_e)]
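The mutual information of the BSC obtained by substituting into Eq. (13.20d) can be checked numerically. The sketch below is illustrative (the function name and the value Pe = 0.1 are arbitrary choices, not from the text); by the symmetry of the channel, I(x; y) is maximized at α = 1/2, where it reduces to the BSC capacity C_s = 1 - H(P_e), with H the binary entropy function.

```python
import numpy as np

def bsc_mutual_info(alpha, pe):
    """I(x; y) of the BSC in bits, summing the four terms of the
    substituted Eq. (13.20d); alpha = P(x1), qe = 1 - Pe."""
    a, ab = alpha, 1.0 - alpha
    qe = 1.0 - pe
    py1 = a * qe + ab * pe             # P(y1)
    py2 = a * pe + ab * qe             # P(y2)
    terms = [a * qe * np.log2(qe / py1),
             a * pe * np.log2(pe / py2),
             ab * pe * np.log2(pe / py1),
             ab * qe * np.log2(qe / py2)]
    return float(sum(terms))

pe = 0.1
h = -pe * np.log2(pe) - (1 - pe) * np.log2(1 - pe)   # binary entropy H(Pe)
print(bsc_mutual_info(0.5, pe), 1 - h)               # both approx 0.531
```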
