Stat 5101 Lecture Notes - School of Statistics

Stat 5101 (Geyer) Course Notes

Note that variance is a special case of covariance. When $X$ and $Y$ are the same random variable, we get
$$\operatorname{cov}(X, X) = E\{(X - \mu_X)^2\} = \operatorname{var}(X).$$
The covariance of a random variable with itself is its variance. This is one reason why covariance is considered a (mixed) second moment (rather than some sort of first moment). A more important reason arises in the following section.

For some unknown reason, there is no standard Greek-letter notation for covariance. We can always write $\sigma_X^2$ instead of $\operatorname{var}(X)$ if we like, but there is no standard analogous notation for covariance. (Lindgren uses the notation $\sigma_{X,Y}$ for $\operatorname{cov}(X, Y)$, but this notation is nonstandard. For one thing, the special case $\sigma_{X,X} = \sigma_X^2$ looks weird. For another, no one who has not had a course using Lindgren as the textbook will recognize $\sigma_{X,Y}$. Hence it is better not to get in the habit of using the notation.)

Variance of a Linear Combination

A very important application of the covariance concept is the second-order analog of the linearity property given in Theorem 2.3. Expressions like $a_1 X_1 + \cdots + a_n X_n$ occurring in Theorem 2.3 arise so frequently that it is worth having a general term for them. An expression $a_1 x_1 + \cdots + a_n x_n$, where the $a_i$ are constants and the $x_i$ are variables, is called a linear combination of these variables. The same terminology is used when the variables are random. With this terminology defined, the question of interest in this section can be stated: what can we say about variances and covariances of linear combinations?

Theorem 2.16.
If $X_1, \ldots, X_m$ and $Y_1, \ldots, Y_n$ are random variables having first and second moments and $a_1, \ldots, a_m$ and $b_1, \ldots, b_n$ are constants, then
$$\operatorname{cov}\left(\sum_{i=1}^m a_i X_i, \sum_{j=1}^n b_j Y_j\right) = \sum_{i=1}^m \sum_{j=1}^n a_i b_j \operatorname{cov}(X_i, Y_j). \tag{2.21}$$

Before we prove this important theorem we will look at some corollaries that are even more important than the theorem itself.

Corollary 2.17. If $X_1, \ldots, X_n$ are random variables having first and second moments and $a_1, \ldots, a_n$ are constants, then
$$\operatorname{var}\left(\sum_{i=1}^n a_i X_i\right) = \sum_{i=1}^n \sum_{j=1}^n a_i a_j \operatorname{cov}(X_i, X_j). \tag{2.22}$$

Proof. Just take $m = n$, $a_i = b_i$, and $X_i = Y_i$ in the theorem.

Corollary 2.18. If $X_1, \ldots, X_m$ and $Y_1, \ldots, Y_n$ are random variables having first and second moments, then
$$\operatorname{cov}\left(\sum_{i=1}^m X_i, \sum_{j=1}^n Y_j\right) = \sum_{i=1}^m \sum_{j=1}^n \operatorname{cov}(X_i, Y_j). \tag{2.23}$$
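These identities are easy to check numerically. The following sketch (not part of the original notes; it assumes NumPy is available) simulates correlated random variables and verifies, on sample moments, both the relation $\operatorname{cov}(X, X) = \operatorname{var}(X)$ and the variance-of-a-linear-combination formula (2.22):

```python
# Numerical sanity check of the identities above (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# Simulate n observations of three correlated random variables X1, X2, X3
# by mixing independent standard normals with a matrix A.
n = 200_000
z = rng.standard_normal((n, 3))
A = np.array([[1.0, 0.5, 0.0],
              [0.2, 1.0, 0.3],
              [0.0, 0.4, 1.0]])
x = z @ A.T                       # columns are X1, X2, X3

# cov(X, X) = var(X): the diagonal of the sample covariance matrix
# equals the vector of sample variances.
S = np.cov(x, rowvar=False)
assert np.allclose(np.diag(S), x.var(axis=0, ddof=1))

# Corollary 2.17: var(a1 X1 + ... + an Xn) = sum_ij ai aj cov(Xi, Xj).
# The double sum is the quadratic form a' S a.
a = np.array([2.0, -1.0, 3.0])
lhs = (x @ a).var(ddof=1)         # sample variance of the linear combination
rhs = a @ S @ a                   # right-hand side of (2.22)
assert np.allclose(lhs, rhs)
print("both identities check out")
```

Note that (2.22) holds exactly for sample moments as well as for population moments, since the sample covariance is bilinear in the same way; the assertions above are exact up to floating-point rounding, not just approximate for large $n$.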
