The Interactive Whiteboards, Pedagogy and Pupil Performance ...

8.1.2. Difference-in-Differences (School Fixed Effects)

By using data from before and after the installation of IWBs as part of London Challenge, we can compare changes in pupil outcomes for departments that experienced large increases in IWB availability and departments that did not. The advantage of this methodology is that any constant factors, observable or unobservable, that may affect overall school performance are eliminated. We do require the assumption that any unmeasured time-varying factors affect departments with a large increase in IWB installation (we can think of these as the treatment group) as much as they do departments with little increase in IWB installation (these could be thought of as the control group of schools).

This approach estimates the effect of the installation of IWBs by exploiting differential growth in IWB availability between October 2003 and October 2004 across schools in the sample. Consider the following equation of school achievement in a single subject:

y_st = γ_s + λ_t + M_st β + ε_st        (1)

where y_st is the average achievement in school s in period t; γ_s is a time-invariant effect of school s on pupil outcomes; λ_t is the school-invariant trend in pupil outcomes; and β is the estimate of the effect of the level of IWB installations, M_st, in the department at school s at time t.

If we assume that trends in achievement would have been the same in the absence of our treatment (the increase in IWB installations in some departments), then we can exploit between-school differences in installation patterns to estimate the effect on pupil outcomes:

Δy_st = (λ_t − λ_{t−1}) + (M_st − M_{s,t−1})β + Δε_st        (2)
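The mechanics of equation (2) can be illustrated on simulated data: first-differencing removes the school fixed effects γ_s, and regressing the change in mean outcomes on the change in IWB availability recovers β. This is a hedged sketch, not the paper's estimation code; the effect size, school count, and distributions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_schools = 100
true_beta = 2.0                      # assumed effect of one extra IWB (illustrative)

gamma = rng.normal(50, 5, n_schools)         # school fixed effects γ_s
lam = np.array([0.0, 1.5])                   # common trends λ_t
M0 = rng.poisson(2, n_schools)               # IWB availability in 2003
M1 = M0 + rng.poisson(3, n_schools)          # availability grows by 2004
M = np.column_stack([M0, M1]).astype(float)
eps = rng.normal(0, 1, (n_schools, 2))

# Equation (1): y_st = γ_s + λ_t + M_st β + ε_st
y = gamma[:, None] + lam[None, :] + M * true_beta + eps

# Equation (2): first-differencing eliminates γ_s
dy = y[:, 1] - y[:, 0]
dM = M[:, 1] - M[:, 0]

# OLS of Δy on a constant and ΔM; the constant absorbs (λ_t − λ_{t−1})
X = np.column_stack([np.ones(n_schools), dM])
beta_hat = np.linalg.lstsq(X, dy, rcond=None)[0][1]
```

With 100 schools the slope estimate lands close to the assumed β of 2, because everything constant within a school has been differenced away.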

This basic first-difference equation sets up the principle behind the difference-in-differences (DID) approach, used throughout this paper to estimate the effect of IWB installation on pupil outcomes. However, because we might be concerned that changes in the pupil populations at schools are confounding the estimates, a pupil-level model is estimated, even though the intervention is a school-level treatment. We do not have observations for pupils at both points in time (2003/04 and 2004/05), so we use school fixed effects rather than a first-difference approach. Expected pupil outcomes are conditional on a vector of pupil-level characteristics X_ist, including prior attainment. All school characteristics and inputs are assumed to be time-invariant, except for the treatment M_st and the year-group composition Z_st:

y_ist = λ_t + M_st β + Z_st χ + X_ist δ + γ_s + ε_ist        (3)
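The school fixed effects in equation (3) can be estimated by the within (demeaning) transformation, which is numerically equivalent to including a dummy variable for every school. The sketch below simulates two cohorts of pupils nested in schools; all parameter values and sample sizes are illustrative assumptions, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n_schools, pupils_per = 30, 40
true_beta, true_delta = 0.5, 2.0     # assumed IWB and prior-attainment effects

# School-level treatment M_st: IWB counts rise between the two years
M0 = rng.poisson(2, n_schools).astype(float)
M1 = M0 + rng.poisson(3, n_schools)

s_idx = np.tile(np.repeat(np.arange(n_schools), pupils_per), 2)  # school of each pupil
t = np.repeat([0.0, 1.0], n_schools * pupils_per)                # cohort indicator
M_st = np.where(t == 0, M0[s_idx], M1[s_idx])
x_prior = rng.normal(0, 1, s_idx.size)        # pupil prior attainment X_ist
gamma = rng.normal(0, 3, n_schools)           # unobserved school effects γ_s

y = gamma[s_idx] + 1.0 * t + true_beta * M_st + true_delta * x_prior \
    + rng.normal(0, 1, s_idx.size)

# Within transformation: subtracting school means sweeps out γ_s,
# exactly as a full set of school dummies would.
def within(v):
    means = np.bincount(s_idx, weights=v) / np.bincount(s_idx)
    return v - means[s_idx]

X = np.column_stack([within(t), within(M_st), within(x_prior)])
coefs = np.linalg.lstsq(X, within(y), rcond=None)[0]
beta_hat, delta_hat = coefs[1], coefs[2]
```

Because M_st varies within schools over time, β remains identified after the school means are removed, while any time-invariant school characteristic drops out.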

The equivalent model in a multi-level framework is a random effects model, which parameterises the distribution of the school effects by assuming they are normally distributed. The equivalent equation is:

y_ist = λ_t + M_st β + Z_st χ + X_ist δ + u_ist,  where u_ist = η_st + ε_ist        (4)

In certain circumstances this would be the more efficient approach. However, our sample of schools is quite small here, so the assumption that the school effects are normally distributed might not be valid. We test the validity of the random effects (multi-level) estimates by comparing them to the fixed effects model in equation (3) using a Hausman specification test. The results indicate that the random effects approach is inappropriate for the English regressions and so is not reported.
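The Hausman test compares the fixed-effects estimates of equation (3) with the random-effects estimates of equation (4): under the null that the random effects are uncorrelated with the regressors, both are consistent and the random-effects estimator is efficient, so a large gap between the two rejects the random-effects specification. A minimal sketch of the statistic, with placeholder coefficients and covariance matrices standing in for the two fitted models (the numbers are illustrative, not results from the paper):

```python
import numpy as np
from scipy import stats

def hausman(b_fe, b_re, V_fe, V_re):
    """H = (b_FE - b_RE)' (V_FE - V_RE)^{-1} (b_FE - b_RE), ~ chi2(k) under H0."""
    diff = b_fe - b_re
    H = float(diff @ np.linalg.inv(V_fe - V_re) @ diff)
    p_value = stats.chi2.sf(H, diff.size)
    return H, p_value

# Placeholder inputs: the FE estimates are less precise (larger variance),
# as efficiency of RE under the null requires.
b_fe = np.array([0.52, 1.98])
b_re = np.array([0.30, 1.90])
V_fe = np.diag([0.010, 0.004])
V_re = np.diag([0.006, 0.003])
H, p = hausman(b_fe, b_re, V_fe, V_re)
# A small p-value rejects the random-effects specification, which is the
# situation the text describes for the English regressions.
```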

