
Preparing for summative computer-based assessment

University of Manchester

Background

The School of Pharmacy and Pharmaceutical Sciences at the University of Manchester had several years’ experience in diagnostic and formative computer-based assessment (CBA) before exploring its use for summative purposes. Pharmacy – a factual subject with large student numbers and an emphasis on accuracy – is a discipline that lends itself to CBA, but a lack of established examination protocols, the possibility of computer failure and fears for the security of the assessments presented concerns at the start of the six-month pilot scheme.

A total of 240 students, some with disabilities, taking first year modules in cell biology and biochemistry were involved in the pilot. The outcomes, published in 2006 in Pharmacy Education,9 indicate that not only can CBA be valid and reliable, but it can also offer advantages over traditional methods when applied to the pharmaceutical sciences.

Technologies, systems and policies

The pilot scheme was run according to the existing University of Manchester framework for examinations, but consideration was also given to the SQA guidelines on online assessment for further education.10 The assessment tool in WebCT® – the VLE of choice at Manchester – was used to deliver the assessments, since data from the university student record system could be fed directly into the VLE and used in setting up assessments. Examination questions written by academic staff in Microsoft® Word were imported into WebCT using Respondus® 3.0, a tool for creating tests offline which uses a Windows® interface.

The online assessment team have since investigated additional tools to increase the efficiency of this process – experience has shown that import software does not always offer a total solution at this level. For example, decimal points, commonly used in the writing of numeric questions, caused question import filters to stumble in ways that are difficult to predict. Hence, some final editing had to be carried out within the VLE.
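One practical response is to scan question files for decimal numbers before import and flag those items for a manual check afterwards. The sketch below assumes a plain-text export with one question or answer per line; the file layout is illustrative rather than the actual Respondus format.

```python
import re
import sys

# A minimal pre-import check, assuming questions are held one per line in
# a plain-text export (the layout is hypothetical, not the Respondus
# format). Decimal numbers were the known trouble spot, so any line
# containing one is flagged for manual checking in the VLE after import.

DECIMAL = re.compile(r"\b\d+\.\d+\b")

def flag_decimal_lines(path):
    """Return (line_number, text) pairs for lines containing a decimal."""
    flagged = []
    with open(path, encoding="utf-8") as f:
        for num, line in enumerate(f, start=1):
            if DECIMAL.search(line):
                flagged.append((num, line.rstrip()))
    return flagged

if __name__ == "__main__":
    for num, text in flag_decimal_lines(sys.argv[1]):
        print(f"check after import (line {num}): {text}")
```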

Student identity was authenticated by students logging on to the VLE using their normal university username and password, and was backed up by the invigilator checking that students taking the examination could be identified from photos on their university ID cards. Timed release was used to ensure that examinations could be accessed only during the timetabled period and, for additional security, a test-specific password was issued. This was given to candidates only when the examination commenced.
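As a rough illustration of those two controls – timed release plus a test-specific password – the sketch below gates entry on both conditions. The window times and password are placeholders, not details from the pilot.

```python
from datetime import datetime, timezone

# Sketch of the two access controls described above: timed release (the
# paper is reachable only within the timetabled slot) and a test-specific
# password announced when the examination commences. The window and
# password below are placeholders, not real exam data.

EXAM_OPENS = datetime(2006, 5, 22, 9, 0, tzinfo=timezone.utc)
EXAM_CLOSES = datetime(2006, 5, 22, 11, 0, tzinfo=timezone.utc)
EXAM_PASSWORD = "example-only"

def may_start(now, supplied_password):
    """Allow entry only inside the timetabled window with the password."""
    in_window = EXAM_OPENS <= now <= EXAM_CLOSES
    return in_window and supplied_password == EXAM_PASSWORD

if __name__ == "__main__":
    print(may_start(datetime.now(timezone.utc), "example-only"))
```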

The possibility of a technical failure on the day of the examination remained a concern. To prepare for this eventuality, paper versions of the tests were produced as a backup and, as a further fail-safe mechanism, candidates were asked to enter their responses online and on an optical mark reader sheet. This prevented the examination from being a fully computer-based one, in which random ordering of questions could take place, but nonetheless enabled a useful comparison between different methodologies during the pilot. The process relied on the assistance of a support team; a backup server operated throughout the examination, and computers were booked at 90% capacity to allow for the breakdown of more than one machine.
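To make the 90% booking rule concrete: if candidates may occupy only 90% of the booked machines, the remaining 10% stand idle as spares. The arithmetic below is our illustration for the 240 candidates in the pilot, not a figure reported by the study.

```python
import math

# Illustrative arithmetic for the 90% booking rule (not figures from the
# study): candidates fill at most 90% of booked machines, so the spare
# 10% absorbs breakdowns.
candidates = 240
seats_booked = math.ceil(candidates / 0.9)   # 267 machines booked
spare_machines = seats_booked - candidates   # 27 spares available
print(seats_booked, spare_machines)
```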

Rethinking assessment practice

The pilot study revealed that some adaptations were necessary. A disadvantage of the WebCT marking tool, for example, was its inability to interpret the range of inaccuracies in spelling that could occur in otherwise correct answers. Results from preliminary practice assessments showed that small errors, such as the inclusion of a hyphen, could be marked as incorrect by the computer, even though a human marker would have allowed them.
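A tolerant marker typically normalises both the candidate's answer and the key before comparing them. The sketch below shows one such normalisation – stripping hyphens, spaces and case – as an illustration of the general technique rather than of WebCT's actual marking algorithm; the example answers are ours.

```python
import re

# Sketch of tolerant answer matching: normalise away hyphens, whitespace
# and case before comparing, so 'beta-blocker' matches 'Beta blocker'.
# This illustrates the general technique, not WebCT's marking algorithm.

def normalise(text):
    """Lower-case the text and remove hyphens and whitespace."""
    return re.sub(r"[\s\-]+", "", text.strip().lower())

def mark(answer, key):
    """Mark correct if the normalised forms match exactly."""
    return normalise(answer) == normalise(key)

if __name__ == "__main__":
    print(mark("beta-blocker", "Beta blocker"))   # True: formatting only
    print(mark("betablockers", "beta-blocker"))   # False: different word
```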
