
Statistics for the Behavioral Sciences by Frederick J. Gravetter, Larry B. Wallnau ISBN 10: 1305504917 ISBN 13: 9781305504912

Statistics is one of the most practical and essential courses that you will take, and a primary goal of this popular text is to make the task of learning statistics as simple as possible. Straightforward instruction, built-in learning aids, and real-world examples have made STATISTICS FOR THE BEHAVIORAL SCIENCES, 10th Edition the text selected most often by instructors for their students in the behavioral and social sciences. The authors provide a conceptual context that makes it easier to learn formulas and procedures, explaining why procedures were developed and when they should be used. This text will also instill the basic principles of objectivity and logic that are essential for science and valuable in everyday life, making it a useful reference long after you complete the course.



546 CHAPTER 16 | Introduction to Regression

The goal of the multiple-regression equation is to produce the most accurate estimated values for Y. As with the single-predictor regression, this goal is accomplished with a least-squared solution. First, we define "error" as the difference between the predicted Y value from the regression equation and the actual Y value for each individual. Each error is then squared to produce uniformly positive values, and then we add the squared errors. Finally, we calculate values for b1, b2, and a that produce the smallest possible sum of squared errors. The derivation of the final values is beyond the scope of this text, but the final equations are as follows:

b1 = [(SP_X1Y)(SS_X2) - (SP_X1X2)(SP_X2Y)] / [(SS_X1)(SS_X2) - (SP_X1X2)²]    (16.16)

b2 = [(SP_X2Y)(SS_X1) - (SP_X1X2)(SP_X1Y)] / [(SS_X1)(SS_X2) - (SP_X1X2)²]    (16.17)

a = M_Y - b1(M_X1) - b2(M_X2)    (16.18)

In these equations, you should recognize the following SS and SP values:

SS_X1 is the sum of squared deviations for X1
SS_X2 is the sum of squared deviations for X2
SP_X1Y is the sum of products of deviations for X1 and Y
SP_X2Y is the sum of products of deviations for X2 and Y
SP_X1X2 is the sum of products of deviations for X1 and X2

Note: More detailed information about the calculation of SS is presented in Chapter 4 (pp. 112–113), and information concerning SP is in Chapter 15 (pp. 490–492). The following example demonstrates multiple regression with two predictor variables.
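The SS and SP quantities used in Equations 16.16–16.18 can be computed directly from deviation scores. A minimal Python sketch is shown below; because Table 16.2 itself is not reproduced here, the scores are made up for illustration only.

```python
def sum_sq(scores):
    """SS: sum of squared deviations from the mean."""
    m = sum(scores) / len(scores)
    return sum((x - m) ** 2 for x in scores)

def sum_prod(a, b):
    """SP: sum of products of paired deviations from each mean."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b))

# Hypothetical scores (illustrative only; not the data in Table 16.2).
x1 = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 9]

print(sum_sq(x1))       # SS for x1: 10.0
print(sum_prod(x1, y))  # SP for x1 and y: 14.0
```

The same two helpers cover every quantity in the equations: SS_X1 and SS_X2 come from `sum_sq`, while SP_X1Y, SP_X2Y, and SP_X1X2 come from `sum_prod` applied to the appropriate pair of variables.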

EXAMPLE 16.6

We use the data in Table 16.2 to demonstrate multiple regression. Note that each individual has a Y score and two X scores that are used as predictor variables. Also note that we have already computed the SS values for Y and for both of the X scores, as well as all of the SP values. These values are used to compute the coefficients, b1 and b2, and the constant, a, for the regression equation.

Ŷ = b1(X1) + b2(X2) + a

b1 = [(SP_X1Y)(SS_X2) - (SP_X1X2)(SP_X2Y)] / [(SS_X1)(SS_X2) - (SP_X1X2)²]
   = [54(64) - 42(47)] / [62(64) - (42)²]
   = 0.672

b2 = [(SP_X2Y)(SS_X1) - (SP_X1X2)(SP_X1Y)] / [(SS_X1)(SS_X2) - (SP_X1X2)²]
   = [47(62) - 42(54)] / [62(64) - (42)²]
   = 0.293

a = M_Y - b1(M_X1) - b2(M_X2) = 7 - 0.672(4) - 0.293(6) = 7 - 2.688 - 1.758 = 2.554

Thus, the final regression equation is

Ŷ = 0.672(X1) + 0.293(X2) + 2.554
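The arithmetic in Example 16.6 can be checked with a short Python sketch. The SS, SP, and mean values below are the ones stated in the example, and the coefficients are rounded to three decimals before computing a, matching the worked solution.

```python
# SS and SP values stated in Example 16.6 (from Table 16.2).
SS_X1, SS_X2 = 62, 64
SP_X1Y, SP_X2Y, SP_X1X2 = 54, 47, 42
M_Y, M_X1, M_X2 = 7, 4, 6

# Shared denominator for Equations 16.16 and 16.17: 62(64) - 42² = 2204.
denom = SS_X1 * SS_X2 - SP_X1X2 ** 2

b1 = round((SP_X1Y * SS_X2 - SP_X1X2 * SP_X2Y) / denom, 3)  # Equation 16.16
b2 = round((SP_X2Y * SS_X1 - SP_X1X2 * SP_X1Y) / denom, 3)  # Equation 16.17
a = round(M_Y - b1 * M_X1 - b2 * M_X2, 3)                   # Equation 16.18

print(b1, b2, a)  # 0.672 0.293 2.554

def predict(x1, x2):
    """Predicted Y from the fitted regression equation."""
    return b1 * x1 + b2 * x2 + a
```

Note that an individual with scores equal to the two predictor means (X1 = 4, X2 = 6) is predicted to have Ŷ = M_Y = 7, which is exactly how the constant a in Equation 16.18 is defined.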
