Spelling and grammar checkers: are they intrusive?

Table 1: Demographic information reported as means or percentages

                                                    Freshmen      English majors  Graduate students
Age                                                 20.25 (2.25)  23.33 (2.83)    29.75 (6.33)
DSPT                                                115.4 (7.0)   117.7 (2.5)     117.7 (2.5)
Language skills moderately or more important
  to career (%)                                     52            100             100
Number of different writing formats attempted       6.4 (2.0)     7.6 (2.0)       7.6 (2.0)
Number of writing formats commonly used             3.2 (2.5)     4.0 (2.0)       4.0 (2.0)
Content revision skill moderate or greater (%)      60            90              100
Surface feature revision skill moderate
  or greater (%)                                    60            90              100
Moderately to strongly agree that misspellings
  detract from the presentation of text (%)         52            85              75
Sometimes to always use a dictionary during
  writing (%)                                       52            35              50
Sometimes to always use the spell checker during
  writing (%)                                       68            60              70

SDs are set in parentheses. DSPT, Diagnostic Spelling Potential Test.

checker and found that students' spelling correction rates on both controlled materials and their own compositions improved from baseline to maintenance. Thus, although correction rates of spell checkers suggest that they are not always reliable (eg, Pedler, 2001), they may still support the proofreading task.

We examined the editing approaches of freshmen-, junior- and senior-year English majors and graduate students when using the checkers versus using a dictionary (see Table 1 for the mean age of each student group). The participants revised two short essays, each rigged with content and spelling and punctuation errors (see Table 2 for examples). The students revised one text on a word processor with access to the checkers and the other on a word processor with access to a hardcover dictionary.
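As a concrete illustration of the two-condition design just described, an assignment of essays and tools could look like the following sketch. The participant IDs, labels and rotation scheme are our own assumptions for illustration, not details reported by the study:

```python
from itertools import cycle, product

# Cross the two counterbalanced factors: which essay comes first,
# and which essay gets the spelling and grammar checkers.
conditions = list(product(
    ["frankenstein_first", "amontillado_first"],        # presentation order
    ["checkers_on_first_essay", "checkers_on_second_essay"],  # technology
))

def assign(participants):
    """Rotate participants through the four cells so they stay balanced."""
    cells = cycle(conditions)
    return {p: next(cells) for p in participants}

# Hypothetical participant IDs; any multiple of four keeps the cells equal.
schedule = assign([f"P{i:02d}" for i in range(1, 9)])
```

With eight participants, each of the four counterbalancing cells is used exactly twice.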
Previously, researchers have had students revise text on paper in the dictionary condition (Daiute, 1986; Gupta, 1998; Langone et al, 1996; Lewis et al, 1998). However, because we only wanted any changes in the organisation of revisions to be attributable to the checker or dictionary, we had students revise text on the word processor in both conditions. Both the technology condition and the presentation order of the texts were counterbalanced.

We selected our groups to represent a continuum of experience in writing and revising. We expected that the English majors and graduate students would make more content-error revisions than the freshmen. We also predicted that spelling and punctuation

© 2006 The Authors. Journal compilation © 2006 British Educational Communications and Technology Agency.


students rated both their content-revision skill (χ² = 18.3, p < 0.05) and their surface-feature-revision skill (χ² = 24.0, p < 0.02) significantly higher than undergraduate students did. The groups also differed on the degree to which they felt misspellings detracted from the presentation of text (χ² = 21.0, p < 0.01). There were no group differences for the frequencies with which the participants reported using a dictionary or spell checker during writing.

Materials

Revision task
To generalise our findings, we used two essays on different topics. An undergraduate student wrote the original essays for a freshman English course. The topics were Mary Shelley's Frankenstein (1818/1994) and Edgar Allan Poe's The Cask of Amontillado (1846/1966). We shortened the essays to approximately 1.5 pages each, with four paragraphs, 23 sentences and 484 words in the Frankenstein essay, and three paragraphs, 24 sentences and 489 words in The Cask of Amontillado essay. We created two versions of each essay by rigging them with content and surface errors.

We included two types of content errors: inconsistencies in logic and improper sentence order. Inconsistencies in logic were sentence-level errors that consisted of inappropriate wording that violated the sequential logic or internal consistency (Baker, 1985) of the text. Improper sentence order consisted of two adjacent sentences being placed in reverse order. We chose these content errors at the level of sentences so that we could ensure that the participants were reading the essays.

Surface errors were errors in spelling and punctuation. We inserted spelling errors into the text wherever the target word made both meaningful and grammatical sense.
British Journal of Educational Technology Vol 37 No 5 2006

The misspellings were all low-frequency words (Francis & Kučera, 1982). We created punctuation errors by replacing the correct punctuation with a commonly confused alternative (eg, replacing a comma with a semicolon).

To avoid confounding error stimuli with essay topic, two essay versions were created for each topic. In each essay version, we planted different sets of error stimuli. Each set consisted of two inconsistencies in logic, one improper sentence order, six spelling errors and three punctuation errors. Thus, we planted one set of errors to create the first versions of the Frankenstein and Amontillado essays, and we planted a different set of errors to create the second versions.

For the checker condition, the spelling and grammar checkers were turned on, and the spelling and punctuation errors were flagged in the text by red and green squiggly lines respectively. Because the correctness of the suggestions offered needed to be consistent across error sets, we modified the correction options offered by the checkers. For the spelling errors, we created a word list of potential suggestions for spelling corrections. If the participant clicked on a flagged error, a pull-down menu would appear with three potential choices. The correct choice and two foils were placed in different positions in the pull-down menu (eg, retrospect, retrespect and retrispect were supplied for the error, retraspect). Because the correct spelling was always listed in the pull-down menu, the two foils were created to be as phonologically and orthographically similar to the correct spelling as possible, to maintain task difficulty. For the punctuation errors, if the participant clicked on a flagged error, a pull-down menu would appear with two choices: the appropriate correction or an option to ignore (eg, no comma and ignore were provided for the comma error, When Montresor, and Fortunato reach the catacombs...).

Questionnaire
To assess educational background and writing experience, we modified a questionnaire on academic skills (LeFevre, Kulak & Heymans, 1992). We included questions about spelling skill and use of the checkers during revision. We also asked students to describe their revision habits.

Spelling test
To assess spelling skill, we used the spelling subtest of the DSPT (Arena, 1982). The scores were standardised according to age. The publisher-reported reliability coefficient for the age range tested was 0.91.

Procedure
We tested the participants individually in one 45-minute session. For the revision task, the participant was shown one version of one of the essays on a word processor. The participant was told to revise the text to create the best possible draft within a period of 15 minutes. Thus, the participants were free to make any revisions that they wanted, including spontaneous revisions to the content of the essay. Then, we gave each participant a version of the second essay, which contained different errors. The order of essay presentation was counterbalanced. The participant had the checkers turned on for one essay and had access to a dictionary for the other essay.
The order of access to the checkers or dictionary was counterbalanced.

The participants were instructed to describe their revisions orally as they revised the text, and these descriptions were tape-recorded. For each revision, the participants explained what they were changing, and the experimenter recorded the category (eg, content or surface) and type of revision (eg, correcting a spelling error). After the participants had completed the revision tasks, they were administered the spelling test, followed by the questionnaire.

Results
Our dependent measures were the percent of properly corrected (1) inconsistencies in logic (n = 2), (2) improper sentence orders (n = 1), (3) spelling errors (n = 6) and (4) punctuation errors (n = 3). In addition to our contrived content errors, we also analysed spontaneous content revisions using three dependent measures: (1) number of changes made to the content (ie, either adding or revising text), (2) number of movements in text (ie, cutting and pasting segments of text) and (3) number of deletions made (ie, deleting segments of text). There were no significant differences between any of the correction rates across text or version, so we collapsed across text and version in the analyses below.
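The analyses that follow report ω² effect sizes alongside each F ratio. The paper does not state the formula it used, but for designs like these ω² is commonly approximated from the F value, the effect's degrees of freedom and the number of observations. A hedged sketch (taking N = 64 as an assumption inferred from the reported error degrees of freedom) reproduces the reported values:

```python
def omega_squared(f_value, df_effect, n_obs):
    """Approximate omega^2 from an F ratio:
    w2 = df_effect*(F - 1) / (df_effect*(F - 1) + N).
    A common textbook approximation; the paper's exact method is not stated."""
    adj = df_effect * (f_value - 1.0)
    return adj / (adj + n_obs)

# Spelling errors, effect of technology: F(1, 62) = 51.87, reported w2 = 0.44
spelling = omega_squared(51.87, 1, 64)
# Punctuation errors: F(1, 62) = 7.40, reported w2 = 0.09
punctuation = omega_squared(7.40, 1, 64)
```

Rounded to two decimals, these come out at 0.44 and 0.09, matching the values reported in the Results.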


Effect of technology
Table 3 shows the mean percent of properly corrected errors and the number of spontaneous content revisions as a function of technology condition (spelling and grammar checkers vs. dictionary). We found no significant main effect of technology condition for the content errors or the spontaneous content revisions. We obtained a significant main effect of technology condition for the correction rates of spelling errors, F(1, 62) = 51.87, p < 0.01, MSE = 1.73, ω² = 0.44, and punctuation errors, F(1, 62) = 7.40, p < 0.01, MSE = 0.56, ω² = 0.09. Students corrected significantly more of these errors in the checker than in the dictionary condition. Spelling ability, as measured by the DSPT, was significantly correlated with correction rate in the dictionary condition, r(63) = 0.30, p < 0.05, but not in the checker condition, r(63) = 0.14.

Table 3: Mean percent of corrected errors and number of spontaneous content revisions in the dictionary and checker conditions

                           Dictionary    Checker
Inconsistencies in logic   42.5 (37.5)   43.0 (42.5)
Improper sentence order    71.0 (46.0)   75.0 (43.0)
Spelling errors            59.5 (29.3)   88.2 (18.3)*
Punctuation errors         51.3 (34.3)   64.0 (35.7)*
Changes to content         5.1 (3.7)     4.7 (3.3)
Movements                  0.4 (0.8)     0.3 (0.7)
Deletions                  4.2 (3.2)     3.8 (2.8)

SDs are set in parentheses. *p < 0.01.

Effect of experience

Table 4: Mean percent of corrected errors and number of spontaneous content revisions as a function of experience

                           Freshmen      English majors  Graduate students
Inconsistencies in logic   22.0 (35.3)   53.8 (38.5)     57.5 (37.3)**
Improper sentence order    56.0 (50.5)   82.5 (39.0)     85.0 (37.0)*
Spelling errors            65.7 (22.6)   81.3 (18.3)     76.7 (27.0)**
Punctuation errors         51.3 (31.8)   61.7 (36.2)     61.7 (36.8)
Changes to content         3.2 (2.6)     6.1 (3.1)       5.8 (4.0)**
Movements                  0.2 (0.6)     0.4 (0.5)       0.6 (1.0)
Deletions                  3.2 (2.7)     4.4 (2.5)       4.6 (3.5)

SDs are set in parentheses. *p < 0.02; **p < 0.01.

Table 4 shows the mean percent of properly corrected errors and the number of spontaneous content revisions as a function of student group (freshmen, English majors


and graduate students). We obtained significant main effects of student group for the correction rates of inconsistencies in logic, F(2, 62) = 10.43, p < 0.01, MSE = 0.67, ω² = 0.23; improper sentence orders, F(2, 62) = 4.43, p < 0.02, MSE = 0.27, ω² = 0.09; and spelling errors, F(2, 62) = 4.71, p < 0.01, MSE = 2.24, ω² = 0.10. Scheffé's post hoc comparisons showed that English majors and graduate students corrected significantly more of these three types of errors than undergraduate students did. We found no significant main effect of student group for the correction rate of punctuation errors.

We obtained a significant main effect of student group for the number of changes to content, F(2, 62) = 6.43, p < 0.01, MSE = 17.57, ω² = 0.14. Scheffé's comparisons showed that English majors and graduate students made significantly more changes to content than undergraduate students did. We found no significant main effect of student group for the number of movements or deletions.

Order in the revision task
Using the tape recordings and the experimenter's record of the participants' descriptions of their revisions during the task, we classified each revision as either a content or a surface revision. For example, if a participant said, 'Now I'm going to change the a in retraspect to an o because I know it's spelled wrong', the description was coded as a surface revision. For each participant, we created a written list of the revisions made in order from the beginning to the end of the task period. Thus, the list began with the first revision made and ended with the last revision made during the task.

If the list indicated that the participant went back and forth between making revisions for content and surface, then the order was coded as interspersing content and surface revisions.
However, if the list indicated a separation of the two revision types (eg, in a list of 12 revisions, the first nine were surface revisions and the last three were content revisions), then the order was coded as separating content and surface revisions. Two raters coded the orders, and the interrater reliability was 0.92. Disagreements were resolved by discussion between the raters.

The most frequent order imposed on revisions was interspersing content and surface revisions, in both the dictionary (86%) and checker (77%) conditions. Similar frequencies were found for each student group. The majority of students (82%) used the same order of revisions in the dictionary and checker conditions; a McNemar change test was nonsignificant.

In the questionnaire, the participants reported the order that they typically use to manage revision goals in their own writing. The most frequently reported order was interspersing content and surface revisions (34%). McNemar change tests revealed significant differences between the participants' reports of their own revision order and the actual order observed in the dictionary (χ² = 26.3, p < 0.05) and checker (χ² = 16.7, p < 0.05) conditions.
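The McNemar change tests reported above compare paired categorical outcomes: the same participant's revision order under two conditions or measurements. The study does not report its cell counts, so the numbers below are invented purely for illustration; the statistic itself depends only on the two discordant counts:

```python
def mcnemar_chi2(b, c):
    """McNemar change test (with continuity correction) on the discordant
    counts: b = cases that switched one way between the paired measurements,
    c = cases that switched the other way. Chi-square distributed, df = 1."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# Invented counts, not the study's data: 2 switched one way, 12 the other.
chi2 = mcnemar_chi2(2, 12)
# With df = 1, chi2 > 3.84 corresponds to p < 0.05.
```

When the discordant counts are small, an exact binomial version (eg, `mcnemar` in statsmodels) is usually preferred over this chi-square approximation.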


In the questionnaire, the participants also reported the order that they were taught to be most efficient when revising text. The most frequently reported taught order for revision was interspersing content and surface revisions (46%). Again, McNemar tests revealed significant differences between the reported taught order of revisions and the actual order of revisions in both the dictionary (χ² = 16.7, p < 0.05) and the checker (χ² = 8.0, p < 0.05) conditions. There was no significant difference between the participants' reports of their own revision order and the taught order of revisions.

Discussion

Differences across technology conditions
In this study, we investigated the use of the dictionary or the checkers when revising an unfamiliar text. We found that content revisions to our contrived errors, as well as spontaneous content revisions, were not significantly different across technology conditions, but all student groups were able to correct more surface errors with the aid of the checkers than they were with the dictionary. Our results support research comparing the usefulness of these tools (Daiute, 1986; Gupta, 1998; Langone et al, 1996; Lewis et al, 1998).

We found a correlation between spelling ability and correction rate in the dictionary condition but not in the checker condition. We conclude that the main source of difficulty for students in the dictionary condition was in detecting the surface errors without the help of the checkers. The detection step in the revision process can be particularly difficult for students (Figueredo & Varnhagen, 2004; Flower et al, 1986). Thus, our study underscores the usefulness of the checkers.

Differences between student groups
We found that both English majors and graduate students corrected significantly more content errors and made more spontaneous changes to content than freshmen did. This result confirms that students with more writing experience make more content revisions. Previous studies have found similar results (eg, Fitzgerald, 1987; Hayes & Flower, 1987).

We also found that both English majors and graduate students corrected significantly more spelling errors than freshmen did. This difference in correction rates was not due to differences in spelling ability across groups. However, English majors and graduate students rated their surface-revision skill significantly higher than undergraduate students did. Thus, although the freshmen were similar to the English majors and graduate students in terms of spelling ability, they were not as vigilant in proofreading the text. Further, spelling knowledge is necessary but not sufficient for error detection (Figueredo & Varnhagen, 2004).

Order in the revision task
We found that most of the students did not change the order that they imposed on the revision task across technology conditions. We conclude that the presence of the checkers did not dramatically affect the order in which students made revisions. Previous


research has also found that word processors generally do not inhibit the revision process (eg, Daiute, 1986).

Fourteen students did, however, change the order in which they made revisions when the checkers were accessible. Specifically, most of these students tended to make surface revisions first, followed by revisions to content. This pattern is intuitively reasonable, because the words and portions of text flagged by the checkers may be distracting to some students. However, the students' performance on the content revisions did not differ across technology conditions. Thus, while the checkers may have altered the order imposed by some students, they did not interfere with content revisions.

The order in which the students revised the unfamiliar text for content and surface features was not related to their reports of how they typically order revisions in their own writing. Several differences between the context of our experimental task and the context of revising one's own writing should be noted. First, while the texts that we used were unfamiliar to the students in our experiment, the students were familiar with their own writing. Second, the students' knowledge of the topics in our rigged essays may not be as great as their topic knowledge in their own writing. Third, studies have shown that students' performance when revising their peers' writing can differ significantly from their performance when revising their own writing (eg, Butterfield et al, 1996; Hull, 1987). Fourth, in our experiment, we did not instruct the students to revise the text for a specific audience (eg, a professor well versed in classic English literature, a high school student, etc), but writers may frequently target their work towards a specific audience.
All of these variables may have contributed to the difference between the order imposed in the experiment and the reported order of revisions in practice.

The students' reports of what they consider to be the taught order of making text revisions were not related to the order imposed in the experiment. Although the students may have been taught specific management techniques, it is not clear that a specific order is always appropriate. Rather than a set order, the writer's goals may guide and change the revision process (Hayes & Flower, 1987).

Implications for future research
Our results on order raise some interesting questions. First, when do writers make judgements about switching concentration back and forth from content to surface? If a paper contains few surface errors (eg, a couple of typographical errors), then it may be appropriate to ignore these errors while improving the content. By contrast, if a paper is riddled with surface errors, it might behove the writer to correct these errors before making content revisions, to increase reading comprehension. Second, how does the order imposed on revisions affect the quality of the final product? Writers who shift often between processes (ie, planning, text generating, revising and reviewing) produce documents that are rated as being of higher quality than those of writers who are not as flexible in shifting (Levy & Ransdell, 1995). Research is needed to investigate how flexibility in managing revisions might aid the revision process.


Our results are reassuring: while the checkers were helpful, their presence did not interfere with revisions. When and how often do writers use the checkers? Some writers may feel more comfortable making revisions from a printout rather than at the computer (Hartley, 1998). Research is needed to better identify the role of the checkers at various stages during revision.

Implications for instruction
Interestingly, we found that students did not implicitly follow the recommendation often made by writing-style manuals (eg, Troyka, 1999) to revise for content first and focus on surface revision later. Are these students using inappropriate revision strategies? We argue that mixing surface and content revisions is probably more efficient than making multiple passes through the text. These two types of revision were originally separated to ease cognitive load (Flower et al, 1986). However, with the aid of the checkers, revising for surface errors requires little cognitive processing and therefore likely does not take away from higher-order content revision processes. Thus, we advise students to turn on the checkers as they revise.

References
Arena, J. (1982). Diagnostic spelling potential test. Novato, CA: Academic Therapy.
Baker, L. (1985). How do we know when we don't understand? Standards for evaluating text comprehension. In D. L. Forrest-Pressley, G. E. MacKinnon & T. G. Waller (Eds), Metacognition, cognition, and human performance: theoretical perspectives Vol. 1 (pp. 155–205). Orlando, FL: Academic Press.
Butterfield, E. C., Hacker, D. J. & Albertson, L. R. (1996). Environmental, cognitive, and metacognitive influences on text revision: assessing the evidence. Educational Psychology Review, 8, 239–297.
Daiute, C. A. (1986). Physical and cognitive factors in revising: insights from studies with computers. Research in the Teaching of English, 20, 141–159.
Figueredo, L. & Varnhagen, C. K. (2004). Detecting a problem is half the battle: the relation between error type and spelling performance. Scientific Studies of Reading, 8, 337–356.
Fitzgerald, J. (1987). Research on revision in writing. Review of Educational Research, 57, 481–506.
Flower, L. S., Hayes, J. R., Carey, L., Schriver, K. & Stratman, J. (1986). Detection, diagnosis, and the strategies of revision. College Composition and Communication, 37, 16–55.
Francis, W. N. & Kučera, H. (1982). Frequency analysis of English usage: lexicon and grammar. Boston: Houghton Mifflin.
Gupta, R. (1998). Can spell checkers help the novice writer? British Journal of Educational Technology, 29, 255–266.
Hartley, J. (1998). The role of printouts in editing text. British Journal of Educational Technology, 29, 277–282.
Hayes, J. R. & Flower, L. S. (1987). On the structure of the writing process. Topics in Language Disorders, 7, 4, 19–30.
Hull, G. (1987). The editing process in writing: a performance study of more skilled and less skilled college writers. Research in the Teaching of English, 21, 8–29.
Jinkerson, L. & Baggett, P. (1993). Spell checkers: aids in identifying and correcting spelling errors. Journal of Computing in Childhood Education, 4, 291–306.
Langone, J., Levine, B., Clees, T. J., Malone, M. & Koorland, M. (1996). The differential effects of a typing tutor and microcomputer-based word processing on the writing samples of elementary students with behavior disorders. Journal of Research on Computing in Education, 29, 141–158.


LeFevre, J., Kulak, A. G. & Heymans, S. L. (1992). Factors influencing the selection of university majors varying in mathematical content. Canadian Journal of Behavioural Science, 24, 276–289.
Levy, C. M. & Ransdell, S. (1995). Is writing as difficult as it seems? Memory & Cognition, 23, 767–779.
Lewis, R. B., Ashton, T. M., Haapa, B., Kieley, C. L. & Fielden, C. (1998). Improving the writing skills of students with learning disabilities: are word processors with spelling and grammar checkers useful? Learning Disabilities: A Multidisciplinary Journal, 9, 3, 87–98.
McNaughton, D., Hughes, C. & Ofiesh, N. (1997). Proofreading for students with learning disabilities: integrating computer and strategy use. Learning Disabilities Research & Practice, 12, 16–28.
Pedler, J. (2001). Computer spellcheckers and dyslexics: a performance survey. British Journal of Educational Technology, 32, 23–37.
Poe, E. A. (1966). The cask of Amontillado. Complete stories and poems of Edgar Allan Poe (pp. 191–195). New York: Doubleday. (Original work published 1846)
Shelley, M. W. (1994). Frankenstein. New York: Penguin Books. (Original work published 1818)
Troyka, L. Q. (1999). Handbook for writers (2nd ed.). Scarborough, ON: Prentice-Hall.
