
Statistics for the Behavioral Sciences by Frederick J. Gravetter, Larry B. Wallnau ISBN 10: 1305504917 ISBN 13: 9781305504912

Statistics is one of the most practical and essential courses that you will take, and a primary goal of this popular text is to make the task of learning statistics as simple as possible. Straightforward instruction, built-in learning aids, and real-world examples have made STATISTICS FOR THE BEHAVIORAL SCIENCES, 10th Edition the text selected most often by instructors for their students in the behavioral and social sciences. The authors provide a conceptual context that makes it easier to learn formulas and procedures, explaining why procedures were developed and when they should be used. This text will also instill the basic principles of objectivity and logic that are essential for science and valuable in everyday life, making it a useful reference long after you complete the course.



550 CHAPTER 16 | Introduction to Regression

computer printout in Figure 16.8, this summary table is reported in the ANOVA table in the center.

Source        SS      df     MS      F
Regression   50.06     2    25.03   4.38
Residual     39.94     7     5.71
Total        90.00     9
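The arithmetic behind the ANOVA summary table can be sketched in a few lines. This is a hypothetical illustration using the values from the table above; the variable names are my own, and the degrees of freedom follow the usual rules (df for regression = number of predictors, df for residual = n - k - 1).

```python
# Recompute the ANOVA summary table entries from SS and df.
ss_regression, df_regression = 50.06, 2
ss_residual, df_residual = 39.94, 7

ms_regression = ss_regression / df_regression   # MS = SS / df, here 25.03
ms_residual = ss_residual / df_residual         # about 5.71
f_ratio = ms_regression / ms_residual           # about 4.38
```

Each MS is simply SS divided by its df, and the F-ratio compares the variance explained by the regression to the residual variance.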

The following example is an opportunity to test your understanding of the standard error of estimate for a multiple-regression equation.

EXAMPLE 16.7

Data from a sample of n = 15 individuals are used to compute a multiple-regression equation with two predictor variables. The equation has R² = 0.20 and SS_Y = 150. Find SS_residual and compute the standard error of estimate for the regression equation. You should obtain SS_residual = 120 and a standard error of estimate equal to √10 = 3.16. ■
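The solution to Example 16.7 can be checked with a short sketch. SS_residual is the unpredicted portion of the Y variability, (1 - R²)SS_Y, and the standard error of estimate divides it by df = n - k - 1 (here 15 - 2 - 1 = 12) before taking the square root; the variable names are mine.

```python
from math import sqrt

# Example 16.7: n = 15 individuals, k = 2 predictors, R^2 = 0.20, SS_Y = 150.
n, k = 15, 2
r_squared, ss_y = 0.20, 150

ss_residual = (1 - r_squared) * ss_y   # unpredicted variability: 0.80 * 150 = 120
df = n - k - 1                         # 15 - 2 - 1 = 12
std_error = sqrt(ss_residual / df)     # sqrt(120 / 12) = sqrt(10), about 3.16
```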

■ Evaluating the Contribution of Each Predictor Variable

In addition to evaluating the overall significance of the multiple-regression equation, researchers are often interested in the relative contribution of each of the two predictor variables. Is one of the predictors responsible for more of the prediction than the other? Unfortunately, the b values in the regression equation are influenced by a variety of other factors and do not address this issue. If b₁ is larger than b₂, it does not necessarily mean that X₁ is a better predictor than X₂. However, in the standardized form of the regression equation, the relative size of the beta values is an indication of the relative contribution of the two variables. For the data in Table 16.2, the standardized regression equation is

ẑ_Y = (β₁)z_X₁ + (β₂)z_X₂ = 0.558z_X₁ + 0.247z_X₂

For the SPSS printout in Figure 16.8, the beta values are shown in the Coefficients table.

In this case, the larger beta value for the X₁ predictor indicates that X₁ predicts more of the variance than does X₂. The signs of the beta values are also meaningful. In this example, both betas are positive, indicating that both X₁ and X₂ are positively related to Y.
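The standardized equation above can be applied directly, since its inputs and output are z-scores. This is a minimal sketch using the beta values from the text; the function name and example inputs are hypothetical.

```python
# Standardized regression equation: predicted z_Y = beta1*z_X1 + beta2*z_X2.
# Because both inputs are z-scores, the betas are directly comparable:
# a one-SD change in X1 moves the prediction more than a one-SD change in X2.
beta1, beta2 = 0.558, 0.247

def predict_z_y(z_x1, z_x2):
    return beta1 * z_x1 + beta2 * z_x2

# An individual one SD above the mean on both predictors:
# predicted z_Y = 0.558 + 0.247 = 0.805
```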

Beyond judging the relative contribution for each of the predictor variables, it also is possible to evaluate the significance of each contribution. For example, does variable X₂ make a significant contribution beyond what is already predicted by variable X₁? The null hypothesis states that the multiple-regression equation (using X₂ in addition to X₁) is not any better than the simple regression equation using X₁ as a single predictor variable. An alternative view of the null hypothesis is that the b₂ (or β₂) value in the equation is not significantly different from zero. To test this hypothesis, we first determine how much more variance is predicted using X₁ and X₂ together than is predicted using X₁ alone.

Earlier we found that the multiple-regression equation with both X₁ and X₂ predicted R² = 55.62% of the variance for the Y scores. To determine how much is predicted by X₁ alone, we begin with the correlation between X₁ and Y, which is

r = SP_X₁Y / √(SS_X₁ · SS_Y) = 54 / √((62)(90)) = 54 / 74.70 = 0.7229

Squaring the correlation produces r² = (0.7229)² = 0.5226 or 52.26%. This means that the relationship with X₁ predicts 52.26% of the variance for the Y scores.
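The correlation computation above reduces to one line of arithmetic. This sketch uses the SP and SS values given in the text (the variable names are my own) to verify r and r².

```python
from math import sqrt

# r = SP_X1Y / sqrt(SS_X1 * SS_Y), the Pearson correlation between X1 and Y.
sp_x1y, ss_x1, ss_y = 54, 62, 90

r = sp_x1y / sqrt(ss_x1 * ss_y)   # 54 / sqrt(5580) = 54 / 74.70, about 0.7229
r_squared = r ** 2                # about 0.5226, i.e., 52.26% of the Y variance
```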
