Lectures on Elementary Probability

Chapter 7

The weak law of large numbers

7.1 Independent implies uncorrelated

We begin by recalling the fact that independent random variables are uncorrelated. First we state the definitions in a particularly convenient form.

Random variables X, Y are independent if for every pair of functions f, g we have

E[f(X)g(Y)] = E[f(X)]E[g(Y)]. (7.1)

Random variables X, Y are uncorrelated if

E[XY] = E[X]E[Y]. (7.2)

This form of the definition follows from the fact that the covariance of X, Y may be written in the form Cov(X, Y) = E[XY] − E[X]E[Y], so (7.2) says precisely that Cov(X, Y) = 0.

The following theorem comes from taking the functions f(x) = x and g(y) = y in the definition of independence.

Theorem 7.1 If X and Y are independent, then X and Y are uncorrelated.

The converse is false: uncorrelated does not imply independent.
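A standard counterexample (not from the text, but a sketch of the claim) is X uniform on {−1, 0, 1} with Y = X². The covariance is zero, yet Y is a function of X, so the two are certainly not independent:

```python
# Counterexample sketch: X uniform on {-1, 0, 1}, Y = X**2.
# X and Y are uncorrelated but dependent.
outcomes = [-1, 0, 1]
p = 1.0 / 3.0  # each outcome equally likely

E_X  = sum(x * p for x in outcomes)           # E[X]  = 0
E_Y  = sum(x**2 * p for x in outcomes)        # E[Y]  = 2/3
E_XY = sum(x * x**2 * p for x in outcomes)    # E[XY] = E[X^3] = 0

cov = E_XY - E_X * E_Y
print(cov)  # 0.0, so X and Y are uncorrelated

# Independence fails: P(X = 1, Y = 1) != P(X = 1) * P(Y = 1),
# since the event {X = 1, Y = 1} is just {X = 1}.
p_joint = p            # P(X = 1, Y = 1) = 1/3
p_prod  = p * (2 * p)  # P(X = 1) * P(Y = 1) = (1/3)(2/3) = 2/9
print(p_joint, p_prod)
```

Taking f(x) = x and g(y) = y detects no dependence here; other choices of f and g (for instance indicator functions of the events above) do.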

7.2 The sample mean

Theorem 7.2 Let X_1, X_2, X_3, . . . , X_n be random variables, each with mean µ. Let

X̄_n = (X_1 + · · · + X_n) / n (7.3)

be their sample mean. Then the expectation of X̄_n is

E[X̄_n] = µ. (7.4)
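A quick simulation sketch of (7.4) (the distribution, sample size, and trial count are illustrative choices, not from the text): averaging many realizations of the sample mean of n exponential draws with mean µ recovers µ.

```python
import random

random.seed(0)
mu, n, trials = 2.0, 10, 20000  # illustrative parameters

def sample_mean(n, mu):
    # X_i ~ Exponential with mean mu; expovariate takes the rate 1/mu.
    # The sample mean is (X_1 + ... + X_n) / n.
    return sum(random.expovariate(1.0 / mu) for _ in range(n)) / n

# Average of many independent realizations of the sample mean
avg = sum(sample_mean(n, mu) for _ in range(trials)) / trials
print(avg)  # close to mu = 2.0
```

Note that (7.4) needs only linearity of expectation, so the X_i need not be independent or identically distributed beyond sharing the mean µ; the simulation simply makes a convenient i.i.d. choice.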

