
B. P. Lathi, Zhi Ding - Modern Digital and Analog Communication Systems-Oxford University Press (2009)


8.2 Random Variables

Similarly,

P_x(x_i) = Σ_j P_xy(x_i, y_j)    (8.23b)

The probabilities P_x(x_i) and P_y(y_j) are called marginal probabilities. Equations (8.23) show how to determine marginal probabilities from joint probabilities. The results of Eqs. (8.20) through (8.23) can be extended to more than two RVs.
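As a concrete illustration of Eqs. (8.23), the sketch below computes both marginals by summing a joint probability table along one index. The joint PMF values here are made-up illustrative numbers, not from the text:

```python
# Marginals from a joint PMF, per Eqs. (8.23):
#   P_x(x_i) = sum over j of P_xy(x_i, y_j)
#   P_y(y_j) = sum over i of P_xy(x_i, y_j)
# The joint table below is hypothetical, chosen only so the entries sum to 1.
P_xy = [
    [0.10, 0.20],  # row i=0: P_xy(x_0, y_0), P_xy(x_0, y_1)
    [0.30, 0.40],  # row i=1: P_xy(x_1, y_0), P_xy(x_1, y_1)
]

# Sum over j (across each row) to get P_x(x_i)
P_x = [sum(row) for row in P_xy]

# Sum over i (down each column) to get P_y(y_j)
P_y = [sum(P_xy[i][j] for i in range(len(P_xy)))
       for j in range(len(P_xy[0]))]

print(P_x, P_y)  # each marginal sums to 1, as a valid PMF must
```

Note that each marginal discards the other variable entirely: P_x carries no information about y, which is exactly what "marginal" means here.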

Example 8.13  A binary symmetric channel (BSC) has error probability P_e. The probability of transmitting 1 is Q, and that of transmitting 0 is 1 - Q (Fig. 8.7). Determine the probabilities of receiving 1 and 0 at the receiver.

Figure 8.7  Binary symmetric channel (BSC): x = 1 (probability Q) is received as y = 1 with probability 1 - P_e and as y = 0 with probability P_e; x = 0 (probability 1 - Q) is received as y = 0 with probability 1 - P_e and as y = 1 with probability P_e.

If x and y are the transmitted digit and the received digit, respectively, then for a BSC,

P_y|x(0|1) = P_y|x(1|0) = P_e
P_y|x(0|0) = P_y|x(1|1) = 1 - P_e

Also,

P_x(1) = Q    and    P_x(0) = 1 - Q

We need to find P_y(1) and P_y(0). From the total probability theorem, we find

P_y(1) = P_x(0)P_y|x(1|0) + P_x(1)P_y|x(1|1)
       = (1 - Q)P_e + Q(1 - P_e)

Similarly,

P_y(0) = (1 - Q)(1 - P_e) + QP_e


These answers seem almost obvious from Fig. 8.7.

Note that because of channel errors, the probability of receiving a digit 1 is not the

same as that of transmitting 1. The same is true of 0.
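The example's formulas are easy to check numerically. The sketch below uses assumed illustrative values P_e = 0.1 and Q = 0.6 (not from the text) to evaluate P_y(1) and P_y(0) from the total probability theorem; it also shows the point made above, that the receive probability of a 1 differs from its transmit probability Q:

```python
# Received-digit probabilities for a BSC via the total probability theorem.
# P_e and Q are assumed illustrative values, not taken from the text.
P_e = 0.1   # channel error probability
Q = 0.6     # probability of transmitting a 1

# P_y(1) = P_x(0) P_y|x(1|0) + P_x(1) P_y|x(1|1)
P_y1 = (1 - Q) * P_e + Q * (1 - P_e)

# P_y(0) = P_x(0) P_y|x(0|0) + P_x(1) P_y|x(0|1)
P_y0 = (1 - Q) * (1 - P_e) + Q * P_e

# The two cases exhaust all possibilities, so the results sum to 1,
# and channel errors pull P_y(1) away from Q (toward 1/2).
print(P_y1, P_y0)
```

With these numbers P_y(1) works out to 0.58 rather than Q = 0.6: the errors that corrupt transmitted 1s outnumber the errors that turn 0s into 1s, exactly the asymmetry the paragraph above describes.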
