Lectures on Elementary Probability
3.5 Independent random variables
Consider discrete random variables X and Y. They are said to be independent if the events X = x, Y = y are independent for all possible values x, y. That is, X, Y are independent if

P[X = x, Y = y] = P[X = x] P[Y = y]     (3.27)

for all possible values x, y of the two random variables.
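The factorization (3.27) can be checked directly for any joint pmf. A minimal sketch (the `is_independent` helper and the two example distributions are illustrative, not from the text): compute the marginals from the joint pmf and test whether the joint mass at every pair equals the product of the marginals.

```python
from itertools import product

def is_independent(joint, tol=1e-12):
    """Check whether a joint pmf {(x, y): p} factors into its marginals."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return all(abs(joint.get((x, y), 0.0) - px[x] * py[y]) <= tol
               for x, y in product(px, py))

# Two fair dice rolled separately: the joint pmf factors, so X, Y are independent.
dice = {(x, y): 1/36 for x in range(1, 7) for y in range(1, 7)}

# By contrast, Y = X (the same die read twice) puts all mass on the diagonal,
# so P[X = x, Y = y] = 0 != P[X = x] P[Y = y] off the diagonal: not independent.
copied = {(x, x): 1/6 for x in range(1, 7)}
```

The dependent example shows that the factorization must hold for every pair (x, y), not just those that actually occur.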
There is of course an appropriate generalization of the notion of independence to more than two random variables. Again there is a multiplication rule, but this time with more than two factors.
Theorem 3.10 If X and Y are independent random variables, and f and g are arbitrary functions, then the random variables f(X) and g(Y) satisfy the multiplication property

E[f(X)g(Y)] = E[f(X)] E[g(Y)].     (3.28)
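Identity (3.28) can be verified exactly on small pmfs. A sketch under illustrative assumptions (the marginals `px`, `py` and the functions f, g below are examples chosen for the demonstration, not from the text): the left side is a sum over the joint pmf, which factors as px[x] * py[y] by independence.

```python
# Marginal pmfs of two independent discrete random variables (illustrative values).
px = {0: 0.2, 1: 0.5, 2: 0.3}
py = {-1: 0.4, 1: 0.6}

f = lambda x: x * x     # an arbitrary function of X
g = lambda y: y + 3     # an arbitrary function of Y

# Left side of (3.28): E[f(X) g(Y)] over the joint pmf p(x, y) = px[x] * py[y].
lhs = sum(f(x) * g(y) * px[x] * py[y] for x in px for y in py)

# Right side of (3.28): E[f(X)] * E[g(Y)], each computed from its marginal.
rhs = sum(f(x) * px[x] for x in px) * sum(g(y) * py[y] for y in py)
```

Because the joint pmf factors, the double sum on the left splits into the product of two single sums, which is exactly the content of the theorem.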
This theorem has an important consequence: independent random variables are uncorrelated. It immediately follows that for independent random variables the variances add, since Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
Theorem 3.11 If X and Y are independent random variables, then X and Y are uncorrelated random variables.
Proof: If X and Y are independent, then applying the previous theorem with f(x) = x − µ_X and g(y) = y − µ_Y gives

E[(X − µ_X)(Y − µ_Y)] = E[X − µ_X] E[Y − µ_Y] = 0 · 0 = 0.     (3.29)

The covariance is zero, so X and Y are uncorrelated.