
3. Join Group Discussion

4. Attend Lectures, Seek Help, and Complete Self-Assessment

5. Monitor the Assignment Discussion

6. Explore On-line resources

7. Complete Individual Tutorial

One issue that had to be faced was training. When the system was fully developed, student users would be supplied with a CD or DVD containing the software and course materials, a demonstration video, and a hardcopy guide. Since at this stage there was no demonstration video, some training had to be provided to the users. Each user was given a demonstration of the student software as it would be shown in the video and then completed the first two exercises under supervision. They were then left to complete the scenarios unassisted, except for the provision of a sheet of Handy Hints that would form part of the hardcopy guide.

Three forms of data collection were used during the experiment: logging of user activities (Dix et al., 1993), observation, and interviews (Patton, 1990; Scott et al., 1991). Where, when and what each user did during their sessions was tracked by the system and saved in a log file, primarily to analyse users' navigation paths and completion times, and to record where they ran into difficulties. Because the student software is designed to support user-centred exploration and multiple navigation paths, a user should be able to complete a task even if they diverge from the shortest path one would expect an experienced user to follow.
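The paper does not give the log format, but the per-event tracking described above (where, when and what each user did) can be illustrated with a short sketch. The following Python is a minimal, hypothetical illustration; the file name, field layout and event names are assumptions, not details taken from IMMEDIATE.

# Hypothetical sketch of the activity logging described above:
# each user action is appended to a log file as a timestamped,
# tab-separated record (user, scenario, screen, action).
import time

LOG_PATH = "immediate_session.log"  # assumed file name

def log_event(user_id, scenario, screen, action):
    """Append one timestamped activity record to the session log."""
    record = "\t".join([
        time.strftime("%Y-%m-%d %H:%M:%S"),
        user_id, scenario, screen, action,
    ])
    with open(LOG_PATH, "a") as log:
        log.write(record + "\n")

# Example: user 2 opens the group discussion from the main menu.
log_event("user2", "scenario3", "main_menu", "open_group_discussion")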

Each participant was interviewed individually, immediately after completing each session (i.e. twice). The interviews were semi-structured: a set of questions was prepared as the starting point for exploring each person's experience with the prototype, covering the difficulties they faced, the aspects of the system they liked, and what they would want changed. These interviews were recorded and transcribed.

Results of the Evaluation

IMMEDIATE ran successfully and without major incident under quite challenging conditions. The results of this evaluation were very positive, with all volunteers able to complete the exercises within the two one-hour sessions. All three stated that by the end of the exercises they were confident that they were able to use the system unaided.

The users were only observed working through the first two scenarios. The data from this observation was of somewhat secondary value because these two exercises were part of the supervised demonstration. However, it did confirm the data from the users' profile questionnaires, indicating that they represented a range of computing experience from confident to unsure. Two of the users were inhibited by previous negative experiences with Windows, including the fear of crashing the system if they did something wrong.

An analysis of the log files revealed that, whilst each of the volunteers got held up at least once, they were all able to complete their scenarios. User 2, the least experienced user, took significantly longer to complete several scenarios (Figure 3).
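Per-scenario completion times of the kind shown in Figure 3 could be recovered from such a log. This sketch assumes the hypothetical record layout above and explicit "start"/"end" actions bracketing each scenario; neither detail is from the paper.

# Hypothetical sketch: derive each user's completion time per scenario
# from the timestamped log records, assuming "start" and "end" actions
# bracket every scenario.
from datetime import datetime

def completion_times(log_path):
    starts, durations = {}, {}
    with open(log_path) as log:
        for line in log:
            stamp, user, scenario, _screen, action = line.rstrip("\n").split("\t")
            t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
            key = (user, scenario)
            if action == "start":
                starts[key] = t
            elif action == "end" and key in starts:
                durations[key] = (t - starts.pop(key)).total_seconds()
    return durations  # {(user, scenario): seconds taken}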

The most fruitful form of data collection proved to be the interviews, which were undertaken with each user individually at the end of each one-hour session. Each volunteer was asked the same set of questions as the basis for exploring their experience with the prototype. The interviews lasted between 20 and 30 minutes. No major usability problems were identified during the interviews.

The complete interviews were transcribed and then analysed for significant, common themes relating to the goals of the evaluation. The major themes identified were:

• The complexity of the general-purpose PC environment.

• The problem of poor Internet service in rural areas.

• The negative effects of personal isolation.

• The importance of training and help.

• Simplifying searches to avoid information overload.

• Support for the Learning Shell.

