
290 EXPERIMENTS AND META-ANALYSIS

used method of investigation, bringing together different studies to provide evidence to inform policy-making and planning. Meta-analysis is a research strategy in itself. That this is happening significantly is demonstrated in the establishment of the EPPI-Centre (Evidence for Policy and Practice Information and Co-ordinating Centre) at the University of London (http://eppi.ioe.ac.uk/EPPIWeb/home.aspx), the Social, Psychological, Educational and Criminological Controlled Trials Register (SPECTR), later transferred to the Campbell Collaboration (http://www.campbellcollaboration.org), a parallel to the Cochrane Collaboration in medicine (http://www.cochrane.org/index0.htm), which undertakes systematic reviews and meta-analyses of, typically, experimental evidence in medicine, and the Curriculum, Evaluation and Management (CEM) centre at the University of Durham (http://www.cemcentre.org). ‘Evidence’ here typically comes from randomized controlled trials of one hue or another (Tymms 1999; Coe et al. 2000; Thomas and Pring 2004: 95), with their emphasis on careful sampling, control of variables, both extraneous and included, and measurements of effect size. The cumulative evidence from collected RCTs is intended to provide a reliable body of knowledge on which to base policy and practice (Coe et al. 2000). Such accumulated data, it is claimed, deliver evidence of ‘what works’, although Morrison (2001b) suggests that this claim is suspect.
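The aggregation of effect sizes across collected RCTs that meta-analysis performs can be illustrated with a minimal sketch of fixed-effect (inverse-variance) pooling, one standard approach in systematic reviews. All effect sizes and variances below are invented for illustration; real reviews of the Cochrane or Campbell kind would also test for heterogeneity between studies and consider random-effects models.

```python
import math

def pooled_effect(effects, variances):
    """Fixed-effect (inverse-variance) pooling of study effect sizes.

    Each study is weighted by 1/variance, so larger, more precise
    studies contribute more to the pooled estimate. Returns the
    pooled effect and its standard error.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical standardized mean differences and variances
# from three RCTs of the same intervention.
effects = [0.30, 0.45, 0.20]
variances = [0.02, 0.05, 0.01]

d, se = pooled_effect(effects, variances)
print(f"pooled effect = {d:.3f}, "
      f"95% CI = ({d - 1.96 * se:.3f}, {d + 1.96 * se:.3f})")
```

Note that the pooled estimate lands nearest the third (smallest-variance) study, which is the point of the weighting: precision, not study count, drives the result.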

The roots of evidence-based practice lie in medicine, where the advocacy by Cochrane (1972) for randomized controlled trials, together with their systematic review and documentation, led to the foundation of the Cochrane Collaboration (Maynard and Chalmers 1997), which is now worldwide. The careful, quantitative-based research studies that can contribute to the accretion of an evidential base are seen as a powerful counter to the often untried and under-tested schemes that are injected into practice.

More recently evidence-based education has entered the worlds of social policy, social work (MacDonald 1997) and education (Fitz-Gibbon 1997). At the forefront of educational research in this area are Fitz-Gibbon (1996; 1997; 1999) and Tymms (1996), who, at the Curriculum, Evaluation and Management Centre at the University of Durham, have established one of the world’s largest monitoring centres in education. Fitz-Gibbon’s work is critical of multilevel modelling and, instead, suggests how indicator systems can be used with experimental methods to provide clear evidence of causality and a ready answer to her own question, ‘How do we know what works’ (Fitz-Gibbon 1999: 33).

Echoing Anderson and Biddle (1991), Fitz-Gibbon suggests that policy-makers shun evidence in the development of policy and that practitioners, in the hurly-burly of everyday activity, call upon tacit knowledge rather than the knowledge which is derived from RCTs. However, in a compelling argument (Fitz-Gibbon 1997: 35–6), she suggests that evidence-based approaches are necessary in order to challenge the imposition of unproven practices, solve problems and avoid harmful procedures, and create improvement that leads to more effective learning. Further, such evidence, she contends, should examine effect sizes rather than statistical significance.
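The distinction Fitz-Gibbon draws matters because a trivially small difference can be statistically significant in a large sample, while the effect size reports how big the difference actually is. A common standardized measure is Cohen’s d: the difference between group means divided by the pooled standard deviation. A minimal sketch, with hypothetical test scores:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: standardized difference between two group means,
    using the pooled standard deviation as the unit."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Sample variances (n - 1 denominator).
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical post-test scores for an experimental and a control group.
experimental = [72, 75, 78, 80, 74, 77]
control = [68, 70, 73, 71, 69, 72]

print(f"Cohen's d = {cohens_d(experimental, control):.2f}")
```

A d of around 0.2 is conventionally read as a small effect, 0.5 as medium and 0.8 as large, which gives policy-makers a substantive answer (how much difference the intervention made) rather than only a verdict on the null hypothesis.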

While the nature of information in evidence-based education might be contested by researchers whose sympathies (for whatever reason) lie outside randomized controlled trials, the message from Fitz-Gibbon will not go away: the educational community needs evidence on which to base its judgements and actions. The development of indicator systems worldwide attests to the importance of this, be it through assessment and examination data, inspection findings, national and international comparisons of achievement, or target setting. Rather than being a shot in the dark, evidence-based education suggests that policy formation should be informed, and policy decision-making should be based on the best information to date rather than on hunch, ideology or political will. It is bordering on the unethical to implement untried and untested recommendations in educational practice, just as it is unethical to use untested products and procedures on hospital patients without their consent.
