Stat 5101 Lecture Notes - School of Statistics

2.7. INDEPENDENCE

Then X and Y are uncorrelated (Problem 2-37) but not independent. Independence would require that

    E{g(X)h(Y)} = E{g(X)}E{h(Y)}

hold for all functions g and h. But it obviously does not hold when, to pick just one example, g is the squaring function and h is the identity function, so that g(X) = Y and h(Y) = Y, because no nonconstant random variable is independent of itself.²

Problems

2-27. Suppose X ∼ Bin(n, p).

(a) Show that

    E{X(X − 1)} = n(n − 1)p²

Hint: Follow the pattern of Example 2.5.1.

(b) Show that

    var(X) = np(1 − p).

Hint: Use part (a).

2-28. Suppose X ∼ Poi(µ).

(a) Show that

    E{X(X − 1)} = µ²

Hint: Follow the pattern of Example 2.5.2.

(b) Show that

    var(X) = µ.

Hint: Use part (a).

2-29. Verify the moments of the DU(1, n) distribution given in Section B.1.1 of Appendix B.

Hint: First establish

    ∑_{k=1}^{n} k² = n(n + 1)(2n + 1)/6

by mathematical induction.

² Bizarrely, constant random variables are independent of all random variables, including themselves. This is just the homogeneity axiom and the "expectation of a constant is the constant" property:

    E{g(a)h(X)} = g(a)E{h(X)} = E{g(a)}E{h(X)}

for any constant a and random variable X.
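A quick numerical check of the uncorrelated-but-not-independent claim above is sketched below; it is not part of the original notes. For concreteness it takes X to be standard normal (any distribution symmetric about zero with enough finite moments would do) and Y = X², and it estimates cor(X, Y) together with the two sides of the product rule for the squaring/identity choice of g and h.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumption for illustration only: X standard normal (symmetric about zero), Y = X^2.
    x = rng.standard_normal(1_000_000)
    y = x ** 2

    # cov(X, Y) = E{X^3} - E{X}E{X^2} = 0 by symmetry, so the estimated correlation is near zero.
    print("cor(X, Y) ≈", np.corrcoef(x, y)[0, 1])

    # g = squaring, h = identity, as in the text: g(X) = X^2 = Y and h(Y) = Y.
    lhs = np.mean(x ** 2 * y)            # estimates E{g(X)h(Y)} = E{X^4}, which is 3 here
    rhs = np.mean(x ** 2) * np.mean(y)   # estimates E{g(X)}E{h(Y)} = (E{X^2})^2, which is 1 here
    print("E{g(X)h(Y)} ≈", lhs, "   E{g(X)}E{h(Y)} ≈", rhs)

The two estimates disagree (about 3 versus 1), so the product rule fails for these g and h, and X and Y cannot be independent even though their correlation is essentially zero.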
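A reminder that may help with the hints "Use part (a)" in Problems 2-27(b) and 2-28(b), not part of the original problem statements: since E{X²} = E{X(X − 1)} + E{X}, the variance can be written in terms of the factorial moment from part (a) as

    var(X) = E{X²} − (E{X})² = E{X(X − 1)} + E{X} − (E{X})².

Substituting the result of part (a) together with the mean (E{X} = np in the binomial case, E{X} = µ in the Poisson case) gives the stated variances.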
