Fundamentals of Probability and Statistics for Engineers
Linear Models and Linear Regression

[Figure 11.1: The least squares method of estimation. The figure shows the observed points (x_i, y_i), the estimated regression line \(\hat{y} = \hat{\alpha} + \hat{\beta}x\), and the true regression line \(y = \alpha + \beta x\); each residual \(e_i\) is the vertical distance from an observed point to the estimated line.]

The least-square estimates \(\hat{\alpha}\) and \(\hat{\beta}\), respectively, of \(\alpha\) and \(\beta\) are found by minimizing

\[ Q = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left[ y_i - (\hat{\alpha} + \hat{\beta} x_i) \right]^2. \tag{11.6} \]

In the above, the sample-value pairs are (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), and e_i, i = 1, 2, ..., n, are called the residuals. Figure 11.1 gives a graphical presentation of this procedure. We see that the residuals are the vertical distances between the observed values of Y, y_i, and the least-square estimate \(\hat{\alpha} + \hat{\beta}x\) of the true regression line \(\alpha + \beta x\).

The estimates \(\hat{\alpha}\) and \(\hat{\beta}\) are easily found by the least-square procedure. The results are stated below as Theorem 11.1.

Theorem 11.1: consider the simple linear regression model defined by Equation (11.4). Let (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) be observed sample values of Y with associated values of x. Then the least-square estimates of \(\alpha\) and \(\beta\) are

\[ \hat{\alpha} = \bar{y} - \hat{\beta}\bar{x}; \tag{11.7} \]

\[ \hat{\beta} = \left[ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \right] \left[ \sum_{i=1}^{n} (x_i - \bar{x})^2 \right]^{-1}. \tag{11.8} \]
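As a minimal sketch, the closed-form estimates of Equations (11.7) and (11.8) can be computed directly; the function name and sample data below are illustrative, not from the text.

```python
def least_squares_fit(x, y):
    """Return (alpha_hat, beta_hat) for the simple linear regression
    model y = alpha + beta * x, per Equations (11.7)-(11.8)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Equation (11.8): beta_hat = S_xy / S_xx
    s_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_bar) ** 2 for xi in x)
    beta_hat = s_xy / s_xx
    # Equation (11.7): alpha_hat = y_bar - beta_hat * x_bar
    alpha_hat = y_bar - beta_hat * x_bar
    return alpha_hat, beta_hat

# Exactly linear data y = 1 + 2x should recover alpha = 1, beta = 2,
# since every residual e_i is then zero and Q attains its minimum of 0.
alpha_hat, beta_hat = least_squares_fit([0, 1, 2, 3], [1, 3, 5, 7])
```

For noisy data the same formulas apply; the fitted line then minimizes the sum of squared vertical distances Q rather than driving it to zero.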
