
Computer-based and paper-based writing assessment: a ...


• Errors: lexical error rates in texts produced in both modes were similar, although the nature of the errors differed.

This study reveals that informal letter texts composed by hand or by computer under timed assessment conditions do show some variation in linguistic and text features. The findings provide useful insights into the comparability of PB and CB written output within an assessment context. The use of live data shows that the findings are not an artefact of an artificial or trial situation, and the results therefore have immediate and direct relevance to testing.

It should be noted that the samples used in the study were limited to candidates who took the CB test, which currently has a smaller candidature than the PB test. As a result, only candidates from three countries were studied. The main L1s in these countries use a roman alphabet, so the findings may not hold for candidates using a non-roman alphabet.

In addition, the study involved two separate samples of candidates, and while the two groups were matched on language ability, the results may have been affected by the nature of the samples themselves. The availability of more candidate information, such as age, gender and L1, would have lent weight to the matching of the samples.

The results of this study have a number of implications for teachers, testers and raters. For teachers, issues of capitalisation, punctuation and paragraphing can be highlighted. If teachers are made aware of areas in which candidates are having problems, these can be addressed at classroom level. For language testers, the differences found between texts written in different modes can be built into rater training, both for the interpretation of mark schemes and for dealing with the presentation effect. For example, if text organisation in terms of paragraphing is to be assessed, raters need to be aware that this may differ depending on the mode in which the text was produced.

This study can be considered a starting point for research into the text differences between PB and CB written output. It would be interesting to explore the effects found with different L1 groups, with those using different writing systems (e.g. Cyrillic, Arabic) and with candidates at different proficiency levels. An exploration of how these features differ across modes in other genres would also be worthwhile. Other text features could be analysed too, for example structure, cohesion and larger lexical chunks. More in-depth exploration of the differences found would give further insight, enabling us to see, for example, how lexical breadth differs between groups, i.e. what types of words are used differently; a simple first pass at such a comparison is sketched below.

Investigation of the writing process using protocol studies would add insight into the composition process and perhaps shed light on why the differences and similarities between the texts occur.
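By way of illustration only, the lexical comparison suggested above could begin with something like the following minimal sketch. It assumes a hypothetical layout of one plain-text file per candidate script, in folders pb/ and cb/ (names invented for the example), and contrasts type-token ratios and mode-exclusive vocabulary; it is not the analysis procedure used in the study.

```python
# Illustrative comparison of lexical breadth between two sets of scripts.
from collections import Counter
from pathlib import Path
import re

def word_counts(folder):
    """Pool every script in a folder into one word-frequency table."""
    counts = Counter()
    for path in Path(folder).glob("*.txt"):
        tokens = re.findall(r"[a-z']+", path.read_text(encoding="utf-8").lower())
        counts.update(tokens)
    return counts

pb = word_counts("pb")   # hypothetical folder of paper-based scripts
cb = word_counts("cb")   # hypothetical folder of computer-based scripts

# Type-token ratio as a crude index of lexical breadth in each mode.
for name, counts in (("paper-based", pb), ("computer-based", cb)):
    tokens = sum(counts.values())
    if tokens:
        print(f"{name}: {len(counts)} types / {tokens} tokens "
              f"(TTR {len(counts) / tokens:.3f})")

# Words appearing in one mode only: candidates for the 'types of words
# used differently', which would still need inspection by hand.
print("PB-only examples:", sorted(set(pb) - set(cb))[:10])
print("CB-only examples:", sorted(set(cb) - set(pb))[:10])
```

A pooled frequency table is the simplest possible starting point; a fuller comparison would control for text length, since raw type-token ratios fall as samples grow.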
There would also be value in exploring how the differences found affect the rating process, for example how examiners respond to texts presented differently in terms of paragraphing or the number and length of sentences. An understanding of any such effect could feed into rater training to enhance the fairness and reliability of written assessment.
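To make these surface features concrete, here is a minimal sketch of how paragraph count, sentence count and mean sentence length might be extracted from a script before studying how raters respond to them. The details are assumptions for the example: blank lines are taken to mark paragraph breaks, and a naive regex marks sentence ends.

```python
import re

def surface_features(text: str) -> dict:
    """Paragraph count, sentence count and mean sentence length for one script."""
    # Assumption: blank lines mark paragraph breaks, as in a typed script.
    paragraphs = [p for p in re.split(r"\n\s*\n", text.strip()) if p.strip()]
    # Naive sentence boundary: ., ! or ? followed by whitespace.
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    words = re.findall(r"\w+", text)
    return {
        "paragraphs": len(paragraphs),
        "sentences": len(sentences),
        "mean_sentence_length": len(words) / max(len(sentences), 1),
    }

sample = ("Dear Sam,\n\nThanks for your letter. "
          "I was pleased to hear from you!\n\nSee you soon.")
print(surface_features(sample))
# {'paragraphs': 3, 'sentences': 3, 'mean_sentence_length': 5.33...}
```

Even such crude counts would let a rating study hold the text constant while varying its presentation across modes.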
This is an extract from Research Notes, published quarterly by University of Cambridge ESOL Examinations. Previous issues are available online from www.CambridgeESOL.org

©UCLES 2008 – The contents of this publication may not be reproduced without the written permission of the copyright holder.
