Statistics for the Behavioral Sciences by Frederick J. Gravetter, Larry B. Wallnau ISBN 10: 1305504917 ISBN 13: 9781305504912

Statistics is one of the most practical and essential courses that you will take, and a primary goal of this popular text is to make the task of learning statistics as simple as possible. Straightforward instruction, built-in learning aids, and real-world examples have made STATISTICS FOR THE BEHAVIORAL SCIENCES, 10th Edition the text selected most often by instructors for their students in the behavioral and social sciences. The authors provide a conceptual context that makes it easier to learn formulas and procedures, explaining why procedures were developed and when they should be used. This text will also instill the basic principles of objectivity and logic that are essential for science and valuable in everyday life, making it a useful reference long after you complete the course.

SECTION 16.3 | Introduction to Multiple Regression with Two Predictor Variables

Therefore, the additional contribution made by adding X_2 to the regression equation can be computed as

(% with both X_1 and X_2) − (% with X_1 alone) = 55.62% − 52.26% = 3.36%

Because SS_Y = 90, the additional variability from adding X_2 as a predictor amounts to

SS_additional = 3.36% of 90 = 0.0336(90) = 3.024
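The same arithmetic can be written as a minimal Python sketch, using the R-squared percentages (52.26% with X_1 alone, 55.62% with both predictors) and SS_Y = 90 given in this example; the variable names are our own:

```python
r2_x1_only = 0.5226   # proportion of Y variance predicted by X1 alone
r2_both    = 0.5562   # proportion predicted by X1 and X2 together
ss_y       = 90.0     # total variability (SS) of the Y scores

r2_additional = r2_both - r2_x1_only    # 0.0336, i.e., 3.36%
ss_additional = r2_additional * ss_y    # 0.0336 * 90 = 3.024

print(f"additional R^2 = {r2_additional:.4f}")   # 0.0336
print(f"SS_additional  = {ss_additional:.3f}")   # 3.024
```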

This SS value has df = 1, and can be used to compute an F-ratio evaluating the significance of the contribution of X_2. First,

MS_additional = SS_additional / 1 = 3.024 / 1 = 3.024

This MS value is evaluated by computing an F-ratio with the MS_residual value from the multiple regression as the denominator. (Note: This is the same denominator that was used in the F-ratio to evaluate the significance of the multiple-regression equation.) For these data, we obtain

F = MS_additional / MS_residual = 3.024 / 5.71 = 0.5296
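A brief Python sketch of this F-ratio, assuming SciPy is available for the p-value; the MS_residual = 5.71 and its df = 7 come from the earlier multiple-regression analysis in the text:

```python
from scipy import stats

ss_additional = 3.024          # computed above
df_additional = 1
ms_additional = ss_additional / df_additional   # 3.024

ms_residual = 5.71             # residual MS from the overall multiple regression
df_residual = 7

f_ratio = ms_additional / ms_residual           # approximately 0.5296
p_value = stats.f.sf(f_ratio, df_additional, df_residual)

print(f"F(1, {df_residual}) = {f_ratio:.4f}, p = {p_value:.3f}")   # far from significance
```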

With df = 1, 7, this F-ratio is not significant. Therefore, we conclude that adding X_2 to the regression equation does not significantly improve the prediction compared to using X_1 as a single predictor. The computer printout shown in Figure 16.8 reports a t statistic instead of an F-ratio to evaluate the contribution of each predictor variable. Each t value is simply the square root of the corresponding F-ratio and is reported in the right-hand side of the Coefficients table. Variable X_2, for example, is reported as VAR00003 in the table and has t = 0.732, which is within rounding error of the square root of the F-ratio we obtained: √F = √0.5296 = 0.728.
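The equivalence between the two tests can be checked directly; a short sketch, again assuming SciPy, showing that the two-tailed p-value for t = √F matches the p-value for the F-ratio:

```python
import math
from scipy import stats

f_ratio = 0.5296
df_residual = 7

t_value = math.sqrt(f_ratio)                      # about 0.728 (the printout shows 0.732 after rounding)
p_from_t = 2 * stats.t.sf(t_value, df_residual)   # two-tailed p for the t statistic
p_from_f = stats.f.sf(f_ratio, 1, df_residual)    # p for the equivalent F-ratio

print(f"t = {t_value:.3f}, p(t) = {p_from_t:.3f}, p(F) = {p_from_f:.3f}")   # the two p-values agree
```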

■ Multiple Regression and Partial Correlations

In Chapter 15 we introduced partial correlation as a technique for measuring the relationship between two variables while eliminating the influence of a third variable. At that time, we noted that partial correlations serve two general purposes:

1. A partial correlation can demonstrate that an apparent relationship between two variables is actually caused by a third variable. Thus, there is no direct relationship between the original two variables.

2. Partial correlation can demonstrate that there is a relationship between two variables even after a third variable is controlled. Thus, there really is a relationship between the original two variables that is not being caused by a third variable.
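As a point of reference, the first-order partial correlation from Chapter 15 can be computed directly from the three pairwise correlations; a minimal sketch with hypothetical correlation values (not taken from the text):

```python
import math

def partial_correlation(r_xy, r_xz, r_yz):
    """First-order partial correlation between X and Y with Z held constant."""
    return (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))

# Hypothetical values: a seemingly strong X-Y correlation that nearly vanishes
# once the third variable Z is controlled (purpose 1 above).
print(round(partial_correlation(r_xy=0.70, r_xz=0.80, r_yz=0.85), 3))   # about 0.063
```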

Multiple regression provides an alternative procedure for accomplishing both of these goals. Specifically, the regression analysis evaluates the contribution of each predictor variable after the influence of the other predictor has been considered. Thus, you can determine whether each predictor variable contributes to the relationship by itself or simply duplicates the contribution already made by another variable.
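In practice, this is what the per-predictor t statistics in a regression printout report. A hypothetical sketch using statsmodels (the data below are placeholders, not the city data discussed next):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.normal(size=15)                          # first predictor
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=15)    # second predictor, largely redundant with x1
y  = x1 + rng.normal(size=15)                     # criterion

X = sm.add_constant(np.column_stack([x1, x2]))    # design matrix with an intercept
fit = sm.OLS(y, X).fit()
print(fit.summary())   # the coefficients table reports a t (and p) for each predictor;
                       # each t tests that predictor's unique contribution after the other
```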

For example, in Chapter 15 (p. 503) we presented data for a sample of 15 cities showing a logical relationship among the number of churches, the number of crimes, and the population size. Although the original data showed a strong, positive correlation between
