Russel-Research-Method-in-Anthropology

Multivariate Analysis 661

You can also work out and test a model of how several independent variables influence a dependent variable all at once. This is the task of multiple regression, which we'll take up next. (For more on partial correlation, see Gujarati 2003.)

Multiple Regression

Partial correlation tells us how much a third (or fourth . . .) variable contributes to the relation between two variables. Multiple regression puts all the information about a series of variables together into a single equation that takes account of the interrelationships among independent variables. The result of multiple regression is a statistic called multiple-R, which is the combined correlation of a set of independent variables with the dependent variable, taking into account the fact that each of the independent variables might be correlated with each of the other independent variables.

What's really interesting is R². Recall from chapter 20 that r²—the square of the Pearson product moment correlation coefficient—is the amount of variance in the dependent variable accounted for by the independent variable in a simple regression. R², or multiple-R squared, is the amount of variance in the dependent variable accounted for by two or more independent variables simultaneously.
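To make the distinction concrete, here is a minimal sketch (not from the text; the data and variable names are invented for illustration) that computes r² for a simple regression with one predictor and multiple-R² for two predictors at once, using ordinary least squares in NumPy:

```python
import numpy as np

# Simulated data: y depends on two independent variables, x1 and x2.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

def r_squared(X, y):
    """Proportion of variance in y accounted for by the predictor(s) in X."""
    X = np.column_stack([np.ones(len(y)), X])      # prepend intercept column
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit
    resid = y - X @ coef
    return 1.0 - resid.var() / y.var()

r2_x1 = r_squared(x1, y)                            # simple regression: r^2
R2_both = r_squared(np.column_stack([x1, x2]), y)   # multiple regression: R^2
```

Because x2 carries information about y that x1 does not, R2_both comes out larger than r2_x1: adding a relevant second predictor raises the share of variance explained.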

Now, if the predictors of a dependent variable were all uncorrelated with each other, we could just add together the pieces of the variance in the dependent variable accounted for by each of the independent variables. That is, it would be nice if R² = r₁² + r₂² + r₃² + . . .

It's a real nuisance, but independent variables are correlated with one another. (This interdependence among independent variables is called multicollinearity, which we'll discuss in depth later.) What we need is a method for figuring out how much variance in a dependent variable is accounted for by a series of independent variables after taking into account all of the overlap in variances accounted for across the independent variables. That's what multiple regression does.
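A small simulation (again invented, not from the text) shows why the simple sum fails when predictors are correlated: each predictor's individual r² claims credit for variance the two predictors explain jointly, so the sum overshoots the true multiple R²:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + 0.2 * rng.normal(size=n)   # x2 strongly correlated with x1
y = x1 + x2 + rng.normal(size=n)

def r_squared(X, y):
    """Proportion of variance in y accounted for by the predictor(s) in X."""
    X = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1.0 - (y - X @ coef).var() / y.var()

r2_1 = r_squared(x1, y)                      # x1 alone
r2_2 = r_squared(x2, y)                      # x2 alone
R2 = r_squared(np.column_stack([x1, x2]), y) # both together
# R2 < r2_1 + r2_2: the overlapping variance is counted once, not twice.
```

Multiple regression resolves the double counting: the fitted coefficients partial out each predictor's overlap with the others, so R² reflects only the distinct variance the set of predictors explains together.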

The Multiple Regression Equation

We covered basic regression in chapter 20, but just to bring you back up to speed, remember that in simple regression we use an equation that expresses how an independent variable is related to a dependent variable. On the left-hand side of the equation, we have the unknown score for y, the dependent variable. On the right-hand side, we have the y-intercept, called a. It's the
