
30.12 PROPERTIES OF JOINT DISTRIBUTIONS

One particularly useful consequence of its definition is that the covariance of two independent variables, $X$ and $Y$, is zero. It immediately follows from (30.134) that their correlation is also zero, and this justifies the use of the term 'uncorrelated' for two such variables. To show this extremely important property we first note that
\begin{align}
\mathrm{Cov}[X,Y] &= E[(X-\mu_X)(Y-\mu_Y)] \nonumber\\
&= E[XY - \mu_X Y - \mu_Y X + \mu_X\mu_Y] \nonumber\\
&= E[XY] - \mu_X E[Y] - \mu_Y E[X] + \mu_X\mu_Y \nonumber\\
&= E[XY] - \mu_X\mu_Y. \tag{30.135}
\end{align}
Now, if $X$ and $Y$ are independent then $E[XY] = E[X]\,E[Y] = \mu_X\mu_Y$ and so $\mathrm{Cov}[X,Y] = 0$. It is important to note that the converse of this result is not necessarily true; two variables dependent on each other can still be uncorrelated. In other words, it is possible (and not uncommon) for two variables $X$ and $Y$ to be described by a joint distribution $f(x,y)$ that cannot be factorised into a product of the form $g(x)h(y)$, but for which $\mathrm{Corr}[X,Y] = 0$. Indeed, from the definition (30.133), we see that for any joint distribution $f(x,y)$ that is symmetric in $x$ about $\mu_X$ (or similarly in $y$) we have $\mathrm{Corr}[X,Y] = 0$.

We have already asserted that if the correlation of two random variables is positive (negative) they are said to be positively (negatively) correlated. We have also stated that the correlation lies between $-1$ and $+1$. The terminology suggests that if the two RVs are identical (i.e. $X = Y$) then they are completely correlated and that their correlation should be $+1$. Likewise, if $X = -Y$ then they are completely anticorrelated and their correlation should be $-1$. Values of the correlation between these extremes indicate the existence of some degree of correlation. In fact it is not necessary that $X = Y$ for $\mathrm{Corr}[X,Y] = 1$; it is sufficient that $Y$ is a linear function of $X$, i.e. $Y = aX + b$ (with $a$ positive). If $a$ is negative then $\mathrm{Corr}[X,Y] = -1$. To show this we first note that $\mu_Y = a\mu_X + b$. Now
\[
Y = aX + b = aX + \mu_Y - a\mu_X \quad\Rightarrow\quad Y - \mu_Y = a(X - \mu_X),
\]
and so, using the definition of the covariance (30.133),
\[
\mathrm{Cov}[X,Y] = aE[(X-\mu_X)^2] = a\sigma_X^2.
\]
It follows from the properties of the variance (subsection 30.5.3) that $\sigma_Y = |a|\sigma_X$ and so, using the definition (30.134) of the correlation,
\[
\mathrm{Corr}[X,Y] = \frac{a\sigma_X^2}{|a|\sigma_X^2} = \frac{a}{|a|},
\]
which is the stated result.

It should be noted that, even if the possibilities of $X$ and $Y$ being non-zero are mutually exclusive, $\mathrm{Corr}[X,Y]$ need not have value $\pm 1$.
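As a quick numerical check of the three properties derived above (not part of the original text), the sketch below estimates sample correlations with NumPy. The sample size, the random seed and the particular choices of variables (a standard normal $X$; the dependent-but-uncorrelated pair $(X, X^2)$, whose joint distribution is symmetric in $x$ about $\mu_X = 0$; and the linear pair $Y = aX + b$) are illustrative assumptions, and the small `corr` helper simply mirrors the defining ratio $\mathrm{Cov}[X,Y]/(\sigma_X\sigma_Y)$ of (30.134).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # large sample so the estimates settle near their true values


def corr(x, y):
    """Sample estimate of Corr[X, Y] = Cov[X, Y] / (sigma_X * sigma_Y)."""
    return np.cov(x, y)[0, 1] / (x.std(ddof=1) * y.std(ddof=1))


# 1. Independent variables: Cov[X, Y] = E[XY] - mu_X mu_Y = 0, so Corr ~ 0.
x = rng.normal(size=n)
y = rng.normal(size=n)
print("independent        :", corr(x, y))        # ~ 0

# 2. Dependent but uncorrelated: Y = X**2 depends on X, yet the joint
#    distribution is symmetric in x about mu_X = 0, so Corr[X, Y] = 0.
print("dependent, uncorr. :", corr(x, x**2))      # ~ 0

# 3. Linear relation Y = aX + b gives Corr[X, Y] = a / |a| = sign(a).
a, b = -3.0, 2.0
print("linear, a < 0      :", corr(x, a * x + b))  # ~ -1
```

With a large sample the three printed estimates should come out close to $0$, $0$ and $-1$ respectively, in agreement with the results established in the text.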
