Mathematical Methods for Physics and Engineering - Matematica.NET
PROBABILITY

◮ Show that if X and Y are independent random variables then E[XY] = E[X]E[Y].

Let us consider the case where X and Y are continuous random variables. Since X and Y are independent, f(x, y) = f_X(x) f_Y(y), so that

E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} xy\, f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} x f_X(x)\, dx \int_{-\infty}^{\infty} y f_Y(y)\, dy = E[X]E[Y].

An analogous proof exists for the discrete case. ◭

30.12.2 Variances

The definitions of the variances of X and Y are analogous to those for the single-variable case (30.48), i.e. the variance of X is given by

V[X] = \sigma_X^2 =
\begin{cases}
\sum_i \sum_j (x_i - \mu_X)^2 f(x_i, y_j) & \text{for the discrete case,} \\[4pt]
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)^2 f(x, y)\, dx\, dy & \text{for the continuous case.}
\end{cases}
\qquad (30.132)

Equivalent definitions exist for the variance of Y.

30.12.3 Covariance and correlation

Means and variances of joint distributions provide useful information about their marginal distributions, but we have not yet given any indication of how to measure the relationship between the two random variables. Of course, it may be that the two random variables are independent, but often this is not so. For example, if we measure the heights and weights of a sample of people we would not be surprised to find a tendency for tall people to be heavier than short people and vice versa.
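The factorisation E[XY] = E[X]E[Y] for independent variables can be checked numerically. The following is a minimal Monte Carlo sketch (not part of the text): the particular choice of a uniform X and an exponential Y, and the sample size, are illustrative assumptions; for independent draws the sample mean of the products should agree with the product of the sample means up to fluctuations of order 1/√N.

```python
import random

random.seed(0)
N = 200_000

# Draw independent samples: X uniform on [0, 1], Y exponential with mean 2.
# (These distributions are chosen purely for illustration.)
xs = [random.random() for _ in range(N)]
ys = [random.expovariate(0.5) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

E_XY = mean([x * y for x, y in zip(xs, ys)])  # sample estimate of E[XY]
E_X, E_Y = mean(xs), mean(ys)                 # sample estimates of E[X], E[Y]

# Independence implies E[XY] = E[X]E[Y]; the discrepancy here is pure
# Monte Carlo noise, shrinking like 1/sqrt(N).
print(abs(E_XY - E_X * E_Y))
```

Repeating the experiment with Y deliberately built from X (e.g. Y = X²) would break the factorisation, which is the content of the covariance discussion that follows in the text.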
We will show in this section that two functions, the covariance and the correlation, can be defined for a bivariate distribution and that these are useful in characterising the relationship between the two random variables.

The covariance of two random variables X and Y is defined by

\mathrm{Cov}[X, Y] = E[(X - \mu_X)(Y - \mu_Y)], \qquad (30.133)

where \mu_X and \mu_Y are the expectation values of X and Y respectively. Clearly related to the covariance is the correlation of the two random variables, defined by

\mathrm{Corr}[X, Y] = \frac{\mathrm{Cov}[X, Y]}{\sigma_X \sigma_Y}, \qquad (30.134)

where \sigma_X and \sigma_Y are the standard deviations of X and Y respectively. It can be shown that the correlation function lies between −1 and +1. If the value assumed is negative, X and Y are said to be negatively correlated; if it is positive they are said to be positively correlated; and if it is zero they are said to be uncorrelated. We will now justify the use of these terms.
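Definitions (30.133) and (30.134) translate directly into sample estimators. The sketch below is illustrative only: the construction Y = X + (independent Gaussian noise) is an assumption chosen so that the true correlation is known in closed form, namely 1/√2 ≈ 0.707, since Cov[X, Y] = V[X] = 1 and σ_Y = √2.

```python
import random

random.seed(1)
N = 100_000

# Construct positively correlated pairs: Y = X + independent unit Gaussian noise.
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [x + random.gauss(0.0, 1.0) for x in xs]

def mean(v):
    return sum(v) / len(v)

mu_x, mu_y = mean(xs), mean(ys)

# Sample analogue of Cov[X, Y] = E[(X - mu_X)(Y - mu_Y)], eq. (30.133).
cov = mean([(x - mu_x) * (y - mu_y) for x, y in zip(xs, ys)])

# Sample standard deviations, then Corr[X, Y] = Cov[X, Y] / (sigma_X sigma_Y),
# eq. (30.134).
sd_x = mean([(x - mu_x) ** 2 for x in xs]) ** 0.5
sd_y = mean([(y - mu_y) ** 2 for y in ys]) ** 0.5
corr = cov / (sd_x * sd_y)

print(corr)  # should lie in [-1, +1] and be close to 1/sqrt(2) here
```

Replacing the noise term by -x would give corr near −1 (perfect negative correlation), while replacing ys by a fresh independent sample would give corr near 0, matching the terminology introduced above.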
