
B. P. Lathi, Zhi Ding - Modern Digital and Analog Communication Systems-Oxford University Press (2009)


434 FUNDAMENTALS OF PROBABILITY THEORY

\overline{\varepsilon} = 0. Hence,

\sigma_\varepsilon^2 = \overline{\varepsilon^2} = \frac{4 m_p^2 P_e \,(2^{2n} - 1)}{3(2^{2n})}    (8.70b)
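The derivation leading to Eq. (8.70b) is truncated in this excerpt, but the final formula can be evaluated directly. A minimal numeric sketch, in which the function name and the sample values of m_p (peak amplitude), P_e (bit error probability), and n (bits per sample) are illustrative choices, not taken from the text:

```python
# Evaluate Eq. (8.70b): mean-square error due to channel bit errors.
# sigma_eps^2 = 4 * m_p^2 * P_e * (2^(2n) - 1) / (3 * 2^(2n))

def channel_error_variance(m_p, P_e, n):
    """Mean-square decoding error per Eq. (8.70b); arguments are illustrative."""
    return 4 * m_p**2 * P_e * (2**(2 * n) - 1) / (3 * 2**(2 * n))

# For large n, (2^(2n) - 1) / 2^(2n) ~= 1, so the variance approaches
# 4 * m_p^2 * P_e / 3.
v = channel_error_variance(m_p=1.0, P_e=1e-5, n=8)
```

Note that the variance grows with m_p² and linearly with P_e, and is nearly independent of n once n is moderately large.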

Variance of a Sum of Independent Random Variables

The variance of a sum of independent RVs is equal to the sum of their variances. Thus, if x and y are independent RVs and

z = x + y

then

\sigma_z^2 = \sigma_x^2 + \sigma_y^2    (8.71)

This can be shown as follows:

\sigma_z^2 = \overline{(z - \bar{z})^2} = \overline{[x + y - (\bar{x} + \bar{y})]^2}

    = \overline{[(x - \bar{x}) + (y - \bar{y})]^2}

    = \overline{(x - \bar{x})^2} + \overline{(y - \bar{y})^2} + 2\,\overline{(x - \bar{x})(y - \bar{y})}

    = \sigma_x^2 + \sigma_y^2 + 2\,\overline{(x - \bar{x})(y - \bar{y})}

Because x and y are independent RVs, (x - \bar{x}) and (y - \bar{y}) are also independent RVs. Hence, from Eq. (8.64b) we have

\overline{(x - \bar{x})(y - \bar{y})} = \overline{(x - \bar{x})} \cdot \overline{(y - \bar{y})}

But

\overline{(x - \bar{x})} = \bar{x} - \bar{x} = 0

Similarly,

\overline{(y - \bar{y})} = 0

and

\sigma_z^2 = \sigma_x^2 + \sigma_y^2

This result can be extended to any number of variables. If RVs x and y both have zero means (i.e., \bar{x} = \bar{y} = 0), then \bar{z} = \bar{x} + \bar{y} = 0. Also, because the variance equals the mean square value when the mean is zero, it follows that

\overline{z^2} = \overline{(x + y)^2} = \overline{x^2} + \overline{y^2}    (8.72)

provided \bar{x} = \bar{y} = 0, and provided x and y are independent RVs.
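The additivity of variances in Eq. (8.71) is easy to confirm numerically. A minimal Monte Carlo sketch, not from the text, using two independent zero-mean RVs (the distributions and sample count are arbitrary illustrative choices):

```python
# Verify Eq. (8.71): for independent x and y, var(z) = var(x) + var(y).
# With zero means this is also Eq. (8.72), since variance equals mean square.
import random

random.seed(0)
N = 200_000
x = [random.gauss(0.0, 2.0) for _ in range(N)]     # variance ~= 4
y = [random.uniform(-3.0, 3.0) for _ in range(N)]  # variance = 6^2 / 12 = 3
z = [a + b for a, b in zip(x, y)]

def var(samples):
    """Population variance: mean square deviation from the sample mean."""
    m = sum(samples) / len(samples)
    return sum((v - m) ** 2 for v in samples) / len(samples)

# var(z) should be close to var(x) + var(y), i.e. roughly 7.
print(var(x), var(y), var(z))
```

The residual gap between var(z) and var(x) + var(y) is twice the sample covariance, which shrinks toward zero as N grows because x and y are independent.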
