
REGRESSION ANALYSIS 541

Box 24.35
Significance level in multiple regression analysis

ANOVA(b)

Model            Sum of squares    df    Mean square    F          Sig.
1  Regression         7969.607      3       2656.536    643.116    0.000(a)
   Residual            190.013     46          4.131
   Total              8159.620     49

a. Predictors: (Constant), Level of interest in the subject, Intelligence, Hours of study
b. Dependent variable: Level of achievement

Chapter 24
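The arithmetic behind the ANOVA table can be checked by hand: each mean square is a sum of squares divided by its degrees of freedom, and F is the ratio of the two mean squares. A minimal sketch in Python, using the values reported in Box 24.35 (the tiny discrepancy in the final decimal of F arises because the printed sums of squares are themselves rounded):

```python
# Reproduce the ANOVA arithmetic in Box 24.35 from its sums of squares.
ss_regression, df_regression = 7969.607, 3   # regression row
ss_residual, df_residual = 190.013, 46       # residual row

ms_regression = ss_regression / df_regression   # mean square = SS / df
ms_residual = ss_residual / df_residual
f_ratio = ms_regression / ms_residual           # F = MS_regression / MS_residual

print(f"MS regression = {ms_regression:.3f}")   # 2656.536
print(f"MS residual   = {ms_residual:.3f}")     # 4.131
print(f"F             = {f_ratio:.1f}")         # 643.1, matching the table
```

SPSS reports Sig. = 0.000 because the upper-tail probability of an F this large on (3, 46) degrees of freedom is far below 0.0005.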

Box 24.36
The beta coefficients in a multiple regression analysis

Coefficients(a)

                             Unstandardized         Standardized
                             coefficients           coefficients
Model                        B            SE        Beta        t          Sig.
1  (Constant)                21.304       10.675                1.996      0.052
   Hours of study             9.637        1.863    0.920       5.173      0.000
   Intelligence              −6.20E−02     0.133   −0.062      −0.466      0.644
   Level of interest
   in the subject             0.116        0.135    0.131       0.858      0.395

a. Dependent variable: Level of achievement


The only independent variable that has a statistically significant effect on the level of achievement is ‘hours of study’. So, for example, with this knowledge, if we knew the hours of study, the IQ and the level of measured interest of a student, we could predict his or her expected level of achievement in the examination.
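The prediction is a linear combination of the unstandardized B coefficients in Box 24.36. A sketch in Python; the student's values (5 hours of study, IQ of 110, interest score of 40) are invented here purely for illustration:

```python
# Prediction equation assembled from the unstandardized B column of Box 24.36:
#   achievement = 21.304 + 9.637*hours − 0.062*IQ + 0.116*interest
def predict_achievement(hours_of_study, intelligence, interest):
    return (21.304                      # (Constant)
            + 9.637 * hours_of_study    # Hours of study
            - 0.0620 * intelligence     # Intelligence (−6.20E−02)
            + 0.116 * interest)         # Level of interest in the subject

# Hypothetical student: 5 hours of study, IQ 110, interest score 40
print(round(predict_achievement(5, 110, 40), 2))  # → 67.31
```

Note that the intelligence and interest terms contribute little here, consistent with their non-significant Sig. values (0.644 and 0.395) in the table.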

Multiple regression is useful in that it can take in a range of variables and enable us to calculate their relative weightings on a dependent variable. However, one has to be cautious: variables may interact with each other and may be intercorrelated (the issue of multicollinearity). For example, Gorard (2001) suggests that

    poverty and ethnicity are likely to have some correlation between themselves, so using both together means that we end up using their common variance twice. If collinearity is discovered (e.g. if correlation coefficients between variables are higher than .80) then one can either remove one of the variables or create a new variable that combines the previous two that were highly intercorrelated.
    (Gorard 2001: 172)

Indeed, SPSS will automatically remove variables where there is strong covariance (collinearity).²
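Gorard's .80 screening rule can be applied before fitting the model by inspecting the pairwise correlations among the predictors. A minimal sketch in plain Python; the function names and the toy data (in which ‘hours’ and ‘interest’ are constructed to move together) are assumptions for illustration, not from the text:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def collinear_pairs(predictors, threshold=0.80):
    """Flag predictor pairs whose |r| exceeds the screening threshold."""
    names = list(predictors)
    flagged = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = pearson_r(predictors[names[i]], predictors[names[j]])
            if abs(r) > threshold:
                flagged.append((names[i], names[j], round(r, 2)))
    return flagged

# Toy data (invented): 'hours' and 'interest' rise together; 'iq' does not
data = {
    "hours":    [2, 4, 6, 8, 10, 12],
    "interest": [21, 39, 61, 78, 102, 119],
    "iq":       [110, 95, 120, 100, 115, 105],
}
print(collinear_pairs(data))  # only the hours/interest pair is flagged
```

A flagged pair would then be handled as Gorard suggests: drop one variable, or combine the two into a single composite predictor.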

In reporting multiple regression, in addition to presenting tables (often of SPSS output), one can use a form of words thus, for example:

Multiple regression was used, and the results include the adjusted R square (0.975), ANOVA (p
