Fundamentals of Probability and Statistics for Engineers
This result leads immediately to an important generalization. Consider a function of X and Y in the form g(X)h(Y) for which an expectation exists. Then, if X and Y are independent,

$$E\{g(X)h(Y)\} = E\{g(X)\}\,E\{h(Y)\}. \tag{4.28}$$

When the correlation coefficient of two random variables vanishes, we say they are uncorrelated. It should be carefully pointed out that what we have shown is that independence implies zero correlation. The converse, however, is not true. This point is more fully discussed in what follows.

The covariance or the correlation coefficient is of great importance in the analysis of two random variables. It is a measure of their linear interdependence in the sense that its value is a measure of the accuracy with which one random variable can be approximated by a linear function of the other. In order to see this, let us consider the problem of approximating a random variable X by a linear function of a second random variable Y, aY + b, where a and b are chosen so that the mean-square error e, defined by

$$e = E\{[X - (aY + b)]^2\}, \tag{4.29}$$

is minimized. Upon taking partial derivatives of e with respect to a and b and setting them to zero, straightforward calculations show that this minimum is attained when

$$a = \rho\,\frac{\sigma_X}{\sigma_Y}$$

and

$$b = m_X - a\,m_Y.$$

Substituting these values into Equation (4.29) then gives $\sigma_X^2(1 - \rho^2)$ as the minimum mean-square error. We thus see that an exact fit in the mean-square sense is achieved when $|\rho| = 1$, and the linear approximation is at its worst when $\rho = 0$. More specifically, when $\rho = +1$, the random variables X and Y are said to be positively perfectly correlated in the sense that the values they assume fall on a straight line with positive slope; they are negatively perfectly correlated when $\rho = -1$, and their values form a straight line with negative slope. These two extreme cases are illustrated in Figure 4.3. The value of $|\rho|$ decreases as scatter about these lines increases.

Let us again stress the fact that the correlation coefficient measures only the linear interdependence between two random variables. It is by no means a general measure of interdependence between X and Y. Thus, $\rho = 0$ does not imply independence of the random variables. In fact, Example 4.10 shows, the …
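The results on this page lend themselves to quick Monte Carlo checks. The three sketches that follow are illustrative only: every distributional choice in them is assumed here, not taken from the text. First, Equation (4.28) for an independent pair, with g(x) = x² and h(y) = cos y as arbitrary test functions:

```python
# Monte Carlo check of Equation (4.28): for independent X and Y,
# E{g(X)h(Y)} = E{g(X)} E{h(Y)}.
# The distributions and the functions g, h are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)     # X ~ N(0, 1), drawn independently of Y
y = rng.uniform(0.0, 1.0, n)   # Y ~ U(0, 1)

g = x**2
h = np.cos(y)

lhs = np.mean(g * h)           # estimate of E{g(X)h(Y)}
rhs = np.mean(g) * np.mean(h)  # estimate of E{g(X)} E{h(Y)}
print(lhs, rhs)                # the two agree up to sampling error
```

For these particular choices both estimates settle near $E\{X^2\}\sin 1 \approx 0.84$.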
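Second, the minimizing coefficients $a = \rho\,\sigma_X/\sigma_Y$ and $b = m_X - a\,m_Y$, together with the minimum mean-square error $\sigma_X^2(1 - \rho^2)$, can be recovered from sample moments. A bivariate normal pair with $\rho = 0.8$ is an assumed test case:

```python
# Check of a = rho*sigma_X/sigma_Y, b = m_X - a*m_Y, and of the
# minimum mean-square error sigma_X^2 * (1 - rho^2).
# The bivariate normal pair with correlation 0.8 is an illustrative choice.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

cov = [[1.0, 0.8], [0.8, 1.0]]          # Var X = Var Y = 1, rho = 0.8
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

rho = np.corrcoef(x, y)[0, 1]
a = rho * x.std() / y.std()             # a = rho * sigma_X / sigma_Y
b = x.mean() - a * y.mean()             # b = m_X - a * m_Y

mse = np.mean((x - (a * y + b))**2)     # achieved mean-square error e
print(mse, x.var() * (1 - rho**2))      # matches sigma_X^2 (1 - rho^2)
```

Since these a and b are the minimizers of Equation (4.29), perturbing either one can only increase the estimated e.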
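Third, zero correlation without independence. The construction below is offered in the same spirit as Example 4.10 (whose details do not appear on this page): take X standard normal and Y = X², so that $\mathrm{Cov}(X, Y) = E\{X^3\} = 0$ while Y is completely determined by X:

```python
# Uncorrelated but dependent: X ~ N(0, 1) and Y = X**2 have
# Cov(X, Y) = E{X^3} - E{X}E{X^2} = 0, yet Y is a function of X.
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)
y = x**2

print(np.corrcoef(x, y)[0, 1])     # ~ 0: the pair is uncorrelated
# Dependence: conditioning on |X| > 2 shifts the mean of Y well above
# the unconditional value E{Y} = 1.
print(np.mean(y[np.abs(x) > 2]))   # ~ 5.7, far from 1
```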
