An investigation of the process of writing IELTS Academic Reading ...

An investigation of the process of writing IELTS Academic Reading ...

An investigation of the process of writing IELTS Academic Reading ...

SHOW MORE
SHOW LESS

You also want an ePaper? Increase the reach of your titles

YUMPU automatically turns print PDFs into web optimized ePapers that Google loves.

Anthony Green and Roger Hawkey

She found that, in comparison to novices, more expert item writers (those producing more positively evaluated texts and items that met the requirements of the test developers, UK examining boards offering tests of English as a Foreign Language):

- are more aware of the test specifications and are quickly able to recognise texts that show potential as test material. Where novices tended to devise a listening script from a source text first and then to write the questions, experts were more inclined to start from the questions and then to build a script to fit with these

- are more aware of the needs of candidates for clear contextual information and are better able to provide accessible contextualising information in the form of short, accessible rubrics and co-text

- explore a range of possible task ideas rather than committing immediately to one that might later prove to be unworkable

- use many more learned rules or ruses than non-experts, including, for example:

  - exchanging words in the text and in the question so that the hypernym appears in the text

  - adding additional text to the script to introduce distraction and reduce the susceptibility of the questions to guessing strategies

Although more experienced item writers tended to outperform the recently trained, expertise was not simply a function of experience. One writer with no previous experience of test item writing performed better in the judgement of a review panel than two item writers with extensive experience (Salisbury 2005). Salisbury also concludes that expertise in Listening test item writing is collective in nature. Individual writers rarely have sufficient capability to meet institutional requirements at the first attempt and need the feedback they receive from their colleagues to achieve a successful outcome. It might be added that item writer expertise itself is not sufficient to guarantee test quality. Even where items are subject to rigorous review, piloting usually reveals further deficiencies of measurement.

The Cambridge ESOL approach to test development is described in detail by Saville (2003) and by Khalifa and Weir (2009). The IELTS test production process for the reading and listening papers is outlined in a document available from the IELTS website, www.ielts.org. The goal of this test production process is that 'each test [will be] suitable for the test purpose in terms of topics, focus, level of language, length, style and technical measurement properties' (IELTS 2007, 1).

IELTS test material is written by freelance item writers externally commissioned by Cambridge ESOL in a process centrally managed from Cambridge and carried out according to confidential test specifications or item writer guidelines laid down by the test developers (although see Clapham 1996a, 1996b for an account of the role of externally commissioned item writing teams in developing the IELTS academic reading module). These guidelines, periodically modified to reflect feedback from item writers and other stakeholders, detail the characteristics of the IELTS modules (speaking, listening and academic or general training reading and writing), set out the requirements for commissions and guide writers in how to approach the item writing process. The guidelines cover the steps of selecting appropriate material, developing suitable items and submitting material. However, a good deal of the responsibility for test content is devolved to the externally commissioned workers, including the item writers and their team leaders or chairs for each of the modules. Khalifa and Weir (2009) describe the chair as having responsibility for the technical aspects of item writing and for ensuring that item writers on their team are fully equipped to generate material of the highest quality.

<strong>IELTS</strong> Research Reports Volume 11 www.ielts.org 6
