
Editorial

A warm welcome to the May issue of Blended Learning in Practice. As you will see from the contents page, this is a 'themed' issue with a focus on assessment. Through recognition of the sector's need to improve assessment and feedback (as witnessed via the National Student Survey), the University of Hertfordshire is engaging in a year-long project focused on improving assessment and feedback. Many of us have revisited our assessment and feedback practice and considered where, and how, improvements can be made to enhance the student educational experience.

Phil Porter

It is hoped that the contents of this issue of Blended Learning in Practice will provide much useful material to assist in assessment re-engineering, and I would particularly direct readers to the 'Student Voice' section as a revealing starting point.

Our first research paper is from Iain Cross, who has made extensive use of electronic voting systems (EVS) in the teaching of Geography and Environmental Science. Iain considers how EVS can be used to assist the implementation of Chickering and Gamson's seven principles of good practice in higher education and concludes that the potential is significant.

Sarah Flynn and Mark Russell then compile their 'top tips' for good practice in assessment for learning. These tips emerged from discussions at a recent University of Hertfordshire Learning and Teaching Institute forum meeting and we hope they will provide some useful material for readers when reflecting on assessment practice.

The second research paper comes from Helen Barefoot. Helen presents the results of a study to assess the potential for enhancing student performance through the use of peer assessment. Enhanced student grades following the inclusion of peer assessment indicate that this technique may be an extremely beneficial educational tool, and results from reflective questionnaires administered to students certainly support this conclusion, with the majority of students indicating that the process was educationally valuable.

Several staff members from the University of Hertfordshire then share their practical experience of using EVS across a range of disciplines, including Geography and Environmental Science, Computer Science, Business, Bioscience, and Anatomy and Physiology. This is invaluable reading both for teachers who are taking their first steps in EVS use and for those who may be more experienced.

In our final research paper Sue Roscoe provides an example of how she used a group wiki as an assessment exercise for level 4 (first year undergraduate) Physiotherapy students. Sue considers the merits and potential drawbacks of this mode of assessment and provides a variety of useful findings for colleagues who may be seeking to diversify their assessment techniques through the use of wikis.


Finally, in our regular 'Student Voice' section, I ask students for their personal views on assessment and feedback. Current University of Hertfordshire students Lucia Lencioni and Martin Smart provide us with answers to questions that we may fear to ask: if you want to know what their definition of 'prompt feedback' is, then read on!

I hope that you find this latest assessment-themed issue of Blended Learning in Practice to be both enjoyable and useful. Please don't hesitate to contact me for further information on any of the above articles and for information concerning material you may wish to submit to the journal.

With best wishes

Dr Philip Porter (p.r.porter@herts.ac.uk)
Editor, Blended Learning in Practice


Contributor Profiles

Phil Porter
p.r.porter@herts.ac.uk

Along with being the editor of Blended Learning in Practice, Phil Porter is a Senior Lecturer in Physical Geography at the University of Hertfordshire in the School of Life Sciences and has been active in glaciological research since 1993. After completing a PhD (Leeds) in borehole instrumentation of fast-flowing glaciers, Phil took up lectureships at Manchester and Leeds and joined the University of Hertfordshire in 2003. His current research interests concern the response of the cryosphere to environmental change. Phil is also an LTI teacher taking a lead on 'research informed teaching' and is currently preparing a Higher Education Academy (HEA) funded field expedition to the Swiss Alps with Dr Iain Cross (see Iain's profile below), where they will be working with UH students and Swiss scientists on a project to engage our students with research fieldwork techniques in geography and environmental science.

Iain Cross
i.cross@herts.ac.uk

Iain is a lecturer in Physical Geography in the School of Life Sciences at the University of Hertfordshire. Iain completed his PhD in eutrophication of shallow lakes at the University of Nottingham in 2009 and continues to expand his research interests in aquatic environments through collaborative research at Nottingham and Hertfordshire. Since joining the University of Hertfordshire in 2009, Iain has contributed to numerous modules and fieldtrips, and is currently preparing to travel to Switzerland with Dr Phil Porter (see Phil's profile above) to take part in an HEA funded expedition to the Swiss Alps in June 2011. Iain has also been awarded a Nuffield Research Bursary to fund collaborative research with a UH undergraduate student during summer 2011. Iain's teaching draws on a range of novel approaches, including electronic voting systems, wikis and video technologies. In particular, Iain's interests lie in the philosophy and application of EVS to improving the undergraduate learning experience.


Helen Barefoot
h.barefoot@herts.ac.uk

Helen Barefoot graduated from the University of Hertfordshire in 1998 and went on to complete a PhD at Cambridge University. She returned to Hertfordshire in 2001 as a lecturer in human physiology, specialising in the teaching of Neuroscience. Helen is currently the Deputy Head of the Learning and Teaching Institute (LTI) and continues to teach Neuroscience within the School of Life Sciences. Helen leads the continuing professional development (CPD) programme for academic staff and her current research interests focus on learning, teaching and assessment activities which enhance the student learning experience.

Sue Roscoe
S.L.Roscoe@herts.ac.uk

Sue Roscoe is a Senior Lecturer in Physiotherapy within the School of Health and Emergency Professions (HEP). She is the Physiotherapy Practice Team Lead and her work has mainly been within the field of practice placement support. Her interests lie in educator training and support. She is currently undertaking a Master's degree in the field of Education.


Supporting good practice in undergraduate education using Electronic Voting Systems (EVS)

Iain Cross
School of Life Sciences, University of Hertfordshire
i.cross@herts.ac.uk

Abstract

Teaching staff in higher education institutions throughout the world are increasingly faced with large cohorts to teach. It can therefore be difficult to implement the seven principles of good practice identified by Chickering & Gamson (1987). Electronic Voting Systems (EVS) can potentially assist tutors in delivering such good practice. Both direct and indirect links between the use of EVS and Chickering & Gamson's seven principles can be made. Direct links map strongly between the principle and the use of EVS, whereas indirect links evoke other mechanisms or suggest a secondary role for EVS within teaching practice. Direct links occur with encouraging active learning and giving prompt feedback, through anonymous responding in class and the rapid collation of student responses to questions. Indirect links include emphasising time on task and communicating high expectations. It is argued that EVS offers significant potential for supporting good practice in undergraduate education.

1. Introduction

Tutors in higher education are increasingly teaching large groups of students. Larger groups can be challenging to teach, as individual identities and learning needs or styles may become diluted. Tutors may struggle to engage large audiences, not through overwhelming amounts of response but because of a reticence to engage on the part of the student audience. Many tutors will have experienced this, particularly when subject material is abstract or complex. Therefore, encouraging active learning within large groups can be difficult, and many tutors become the 'sage on the stage', filling the student 'vessels' with knowledge to be recited during examinations (Moss & Crowley, 2011). Because active learning increases exam performance, this difficulty of engagement needs to be overcome.

Electronic Voting Systems (EVS) offer a potential mechanism by which large audiences can be engaged in processing material delivered through lectures, ultimately enhancing student performance. In this paper the potential of EVS to support good practice in undergraduate education is addressed, drawing on the Principles of Good Practice in Higher Education defined by Chickering & Gamson (1987) to identify direct and indirect links with good practice. Also considered is the view expressed in some quarters that EVS is simply an 'amusing novelty' (Lantz, 2010), 'pretty lights' to decorate difficult subject matter (King & Robinson, 2009), or simply a distracting and time-consuming diversion that keeps students from assimilating material delivered by a lecturer. Should EVS be considered a replacement for traditional teaching techniques (Figure 1), or be viewed as a tool in the teacher's armoury to stimulate active learning in large student cohorts?


Figure 1. The implementation of new technology may be best as a complement to, not a replacement for, traditional teaching methods (Source: www.glasbergen.com/wp-content/gallery/teachers-and-staff/edu31.gif).

2. What are EVS systems?

EVS (or audience response systems, ARS [e.g. Hay & LeSage, 2009]; classroom clicker systems, CCS or 'clickers' [e.g. Boyle & Nicol, 2003]; student response systems, SRS [e.g. Stav et al., 2010]) use wireless technology to record and store the responses of individuals to multiple choice questions (MCQs). Typically, three hardware components are required: multiple handsets (clickers), a PC and a receiving unit. These components are supported by software that allows the creation of 'interactive' slides within Microsoft PowerPoint and retains the voting data. A typical sequence of use involves a multiple choice question (MCQ) being designed and displayed, followed by students responding (e.g. Lou & Lorimer, 2010). A summary of the responses to each choice is then displayed within the PowerPoint presentation. The data used to build the summary display may then be stored and manipulated to provide more detailed analysis of the responses, such as the performance of individual respondents within a session.

3. EVS to support good principles of undergraduate education

Based on reflection and experience, it can be argued that EVS brings four key advantages to learning:

- Promotion of active learning, by asking students to synthesise and digest information after delivery and challenging their knowledge and understanding;
- Generation of student feedback, by providing an instant assessment of student responses to questions;
- Generation of staff feedback, through demonstrating the effectiveness of teaching via student performance;
- Encouragement of attendance and engagement, by assisting in recording presence in lectures and monitoring students' performance.


These points are discussed in detail in sections 4.1 and 4.2, and the potential for further development of the advantages is presented.

Each key advantage can be mapped onto Chickering & Gamson's (1987) Seven Principles for Good Practice in Undergraduate Education. These have been widely used in higher education to structure attempts at improving the learning experience for undergraduate students. Chickering & Gamson argue that although any of the seven principles (shown as red boxes in Figure 2) can stand alone, they are multiplicative: in effect, the product of all is greater than their total sum. The principles are also applicable to all subject areas and students, regardless of academic content or student ability (Chickering & Gamson, 1987). EVS may not support each of the good practice principles directly, but still offers significant opportunities to enhance good practice through more tangential links. Figure 2 shows how EVS can enhance the delivery of good practice by either 'direct' or 'indirect' links with Chickering & Gamson's seven principles. Each set of links is considered separately in the following discussion.

[Figure 2 diagram: EVS aspects (active learning, student feedback, staff feedback, attendance and engagement) linked to principles including student/staff contact, reciprocity between students, emphasising time on task, communicating high expectations and respecting diverse talents.]

Figure 2. A conceptual model of the direct and indirect links by which EVS can support principles for good practice in undergraduate education (Chickering & Gamson, 1987). Blue boxes represent key aspects of EVS in teaching and learning and red boxes are the seven principles. Principles and aspects within the dotted box are direct links; other links are considered indirect.


4.1 Direct links with Chickering and Gamson's principles

EVS supports active learning through anonymity

'Active learning' is considered by many Higher Education (HE) practitioners to be a cornerstone for improving undergraduate education. A number of definitions exist, but all can be paraphrased to suggest that active learning is 'learning by doing'. 'Doing' can be a range of activities, principally reading, writing, discussing and solving problems. All four activities promote deeper learning of material by encouraging analysis, synthesis and evaluation (Bonwell & Eison, 1991). Active learning is thought to improve exam performance when compared to the traditional lecture format (Crossgrove & Curran, 2008; Freeman et al., 2007; Yoder & Hochevar, 2005 in Lantz, 2010), and is therefore widely promoted as a principle for quality undergraduate learning.

One of the most basic examples of active learning is asking students to respond to a question. This is often met by a reticence to respond on the part of students (e.g. Steinert & Snell, 1999). Anecdotal experience suggests that there may be a number of reasons for this (Box 1). Concerns about how peers will view participation are particularly important from the undergraduate perspective, and may be especially strong when there is a reasonable likelihood of giving a wrong answer. Experience of teaching small groups within the School of Life Sciences at the University of Hertfordshire suggests that even smaller class sizes may not improve the likelihood of student response to questions (corroborating the findings of Draper & Brown, 2004), particularly when there are strong social groups present amongst the class. Ultimately, few students will respond publicly, and any students that are comfortable with answering quickly become established as 'class-answerers', allowing the remainder of the class simply to wait for them to answer. This phenomenon has been observed frequently in lectures, and anecdotal evidence suggests that a lack of knowledge or understanding is not the only barrier to engagement: many able students shy away from responding publicly.

Box 1. Experiences of barriers to participation.

From the undergraduate's perspective:
- 'Uncool' image of giving the correct answer
- Fear of giving an incorrect answer
- Genuine lack of knowledge
- Confusion as to what the tutor is asking
- Disinterest in the subject matter
- Feeling the answer is too simple to be correct
- Lack of incentive to think of the correct answer

From the tutor's perspective:
- Poor phrasing of the question
- Question formulated at an inappropriate level
- Question unintentionally exclusive – fails to cross barriers associated with culture, language, disabilities etc.
- Timing of the question inappropriate to the flow of material or stage of the lecture
- Question set as group work but individuals rely on others to answer

A number of studies have suggested that raising hands, showing response cards or peer discussions regarding the response to a question may improve participation and performance by removing the peer pressure of verbally responding in public (Freeman et al., 2006; Jones et al., 2001). Husinger et al. (2008) suggest that for cultural reasons Asian students may be particularly discouraged from giving public answers due to a potential loss of respect or esteem.


EVS supports active learning by encouraging students to respond to questions posed by lecturers because their responses are anonymous. Students who may be reluctant to discuss opinions or answers with tutors may be more likely to respond when using EVS (e.g. King & Robinson, 2009; Jones et al., 2001). A number of researchers have found evidence which supports the findings of Jones et al. (2001). For example, Freeman and Blayney (2005) found that when comparing EVS to traditional response types (raising hands), students reported significantly higher interaction in lectures, understanding of material and self-evaluation of learning. Sharma et al. (2005) reported that 38% of students were 'very comfortable' using EVS to answer questions in a large lecture, but only 4% would be happy to answer verbally. Amongst Asian students, EVS has been shown to increase participation. This may be related to Chinese students potentially being used to a tightly structured form of 'knowing' information, and to such information being perceived as something delivered by a tutor rather than a peer (Cortazzi, 2002 in Beekes, 2006). Therefore, discussion and peer-led learning may be less appealing to Chinese students than the supposed structure of an EVS-based quiz. However, Freeman and Blayney (2005) found that language, local or international origin, gender and disability were not significant in explaining differences in students' perceptions.

Freeman & Blayney (2005) explore the role of the lecturer when expecting and encouraging students to respond in class. Although a class was conducted using traditional response methods as a control for comparison with EVS responses, the lecturer felt that when using EVS he was

"freer to encourage student engagement and interaction…without offending any student because they would be anonymous…[Insisting or waiting] may have been insensitive or discouraging [when not using EVS]."
(Freeman & Blayney, 2005, p.30)

Students may perceive that a lecturer is less willing to wait or insist on a response, and may therefore be less motivated to engage and answer the question when traditional response methods are used. Furthermore, Freeman & Blayney (2005) suggest that the lecturer may have altered the time allowed for responses between EVS and traditional response types because of the difficulty of accurately estimating how many students had answered the question but were unwilling to publicly admit they had. One advantage of EVS is that this information is easy to acquire, allowing accurate pacing of questioning and data on voting or abstention from voting.

In summary, asking students to respond to questions in class is a fundamental element of active learning. Personal experience of teaching and evidence from the literature suggest that students can be uncomfortable answering questions publicly. By anonymising responses to questions, more students can become active learners as they are more likely to answer questions.


EVS facilitates prompt student feedback

Assessment is an important tool for allowing students to understand their progress and understanding of a topic, and a good opportunity for a tutor to gauge the performance of a class and of individuals within it (Nicol & MacFarlane-Dick, 2006). Lantz (2010) suggests that feedback works by offering a corrective mechanism, replacing the incorrect answer in the memory of students. In order to maximise the benefit of feedback, students should be made aware of the correct answer in addition to knowing whether they are right or wrong (Travers et al., 1964, in Lantz, 2010). Beatty (2004) argues that students are able to see the limitations of their own knowledge so that they can correct this and concentrate their learning appropriately. Beatty (2004) further argues that

"[EVS]-based instruction helps [students] take charge of their own learning, seeking out the information and experiences they need to progress… it can impact their approach to learning beyond class, helping them transform into more motivated, empowered [and] aggressive learners."

Traditional systems of providing feedback rely on a physical process of marking or grading answer sheets by a tutor and returning them to students. This is frequently a lengthy process for students and tutors (Russell, 2008) and can require a significant administrative input. Personal experience suggests that the time commitment for tutors assessing essays is little different from that required when marking short-answer questions. In this 'manual' model of feedback provision, it is the responsibility of the tutor to provide the correct answer so that students gain the benefit of feedback. However, when undertaking a significant quantity of assessment, particularly when comments are repetitive, tutors are less likely to provide the correct answer. Therefore, the potential to provide en masse corrections to students' understanding in class may significantly improve their memory and retention of correct answers, potentially improving their performance in examinations.

Feedback provided rapidly is thought to be more useful to students than delayed feedback. A meta-analysis of 53 case studies by Kulik & Kulik (1988) found that delaying the results of quizzes or tests never improved students' performance in recalling correct answers, whereas rapid feedback significantly improved performance in some studies. Kulik & Kulik (1988) stress the importance of appreciating the conditions in which experimental or observational data are gathered, and encourage a cautious interpretation of their findings. However, Fulmar & Rollings (1976) found that students who received immediate feedback using a form of EVS performed significantly better when recalling answers than students who received no immediate feedback. Although the research evidence does not support a motivational element, personal experience suggests that students who feel they understand material by offering correct answers are frequently more motivated to study further. EVS can extend this motivation to all students in a class who answer correctly, rather than to an individual student responding to a question.


EVS has significant potential to improve the immediacy of feedback given by tutors. EVS systems offer the tutor the opportunity to provide immediate feedback to a captive audience. Typically, the number of students selecting each answer is shown when voting is complete. An effective use of EVS is then for a brief discussion to take place, led by the tutor, in which the reasons for selecting the incorrect answers can be discussed and the details of the correct answer elaborated on. This is a particularly important element of using EVS, since multiple choice questions do not naturally foster deep understanding of subject matter and may over-emphasise the opportunity to cover large quantities of material with little depth of understanding (Biggs, 1996). This highlights the critical role of the tutor in devising appropriate material (see section 4.2, 'communicating high expectations'). In contrast, Draper (2009) suggests that EVS and multiple-choice questions can be used in a deeper way, acting as a 'catalytic assessment', and offers practical advice for this (see Box 2). EVS directly supports the principle of giving prompt feedback, and by doing so offers students the opportunity to target their learning to topics on which they perform poorly.

Box 2. Using multiple choice questions and EVS for catalytic assessment and deeper learning (from Draper, 2009, p.291).

1. Assertion–reason questions, which can be and have been used with EVS.
2. Taking an MCQ and having the learner generate reasons, for and against each response option, rather than simply ticking one. (This is usually done on paper as a private revision technique.)
3. Confidence-based marking, which is normally delivered by ICT and could be done with EVS with some (but not all) software.
4. Mazur's method of using brain-teasers to prompt peer discussion, which is routinely done using EVS.
5. Having students create MCQs as part of presentations using EVS.
6. Having students create MCQs for use in tests that may be administered either by using EVS or on paper.

4.2 Indirect support of Chickering & Gamson's good practice

In addition to the direct links of feedback and active learning, EVS can contribute to many of the other principles in indirect ways. 'Indirect' in this context refers to the supporting link being less clear or more obscure. Indirect links may not have such a strong association…


Respects diverse talents and ways of learning: It is recognised that students use a number of different 'cognitive styles' when learning. Cognitive styles are defined as 'an individual's consistent approach to organising and processing information during learning' (Riding & Sadler-Smith, 1997). EVS is unlikely to promote an approach to learning that is satisfactory to all students. EVS is more likely to contrast with what many students may have experienced during their education before university, and to differ from their likely expectations of lectures and seminars. However, exposing undergraduates to a range of learning styles is likely to be beneficial for them, improving their adaptability and highlighting styles that may be new to them.

5. Conclusions

In conclusion, EVS offers the opportunity for tutors to support the principles of good undergraduate education as presented by Chickering & Gamson (1987). These links are either explicit (e.g. providing feedback and encouraging active learning) or less clear (e.g. enhancing student–tutor contact time). Evidence from the literature and reflections on personal experience suggest that when EVS is used to assist learning, students are empowered to take ownership of their learning, engage with tutors and subject material and participate in a learning community. Improved attendance and attention during classes may further enhance students' learning experiences. Acting together, these processes can enable students to improve their academic performance compared to when EVS is not used. Although unpacking the precise contribution of each element to students' overall learning experience is difficult, EVS undoubtedly has strong potential to improve the implementation of good practice in undergraduate education.

Acknowledgements

I am grateful to colleagues at the University of Hertfordshire for their constructive criticisms and stimulating discussions during the preparation of this paper, particularly Susanna Mason from Clinical and Community Pharmacy, and Angela Hammond and Philip Porter from the Learning and Teaching Institute. The kind patience of students whilst I overcame the technical and academic challenges of using EVS is also acknowledged.

References

Beatty, I. (2004) Transforming student learning with classroom communication systems. EDUCAUSE Centre for Applied Research: Research Bulletin 3: Boulder, Colorado.

Beekes, W. (2006) 'The "Millionaire" method for encouraging participation' Active Learning in Higher Education 7 (1): 25-36.


Biggs, J. (1996) 'Enhancing teaching through constructive alignment' Higher Education 32 (3): 347-364.

Bonwell, C. C. & Eison, J. A. (1991) Active learning: creating excitement in the classroom. ERIC Clearinghouse on Higher Education, George Washington University, Washington.

Boyle, J. T. & Nicol, D. J. (2003) 'Using classroom communication systems to support interaction and discussion in large class settings' ALT-J: Research in Learning Technology 11 (3): 43-57.

Blumenfeld, P. C., Marx, R. W., Soloway, E. & Krajcik, J. (1996) 'Learning with peers: From small group cooperation to collaborative communities' Educational Researcher 25 (8): 37-40.

Chickering, A. W. & Gamson, Z. F. (1987) 'Seven principles for good practice in undergraduate education' American Association of Higher Education Bulletin 39 (7): 3-7.

Crossgrove, K. & Curran, K. L. (2008) 'Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material' CBE–Life Sciences Education 7: 146-154.

Draper, S. W. (2009) 'Catalytic assessment: understanding how MCQs and EVS can foster deep learning' British Journal of Educational Technology 40 (2): 285-293.

Draper, S. W. & Brown, M. I. (2004) 'Using a personal response system for promoting student interaction' Journal of Computer Assisted Learning 20 (2): 81-94.

Draper, S. W., Cargill, J. & Cutts, Q. (2002) 'Electronically enhanced classroom interaction' Australian Journal of Educational Technology 18 (1): 13-23.

Freeman, M. & Blayney, M. (2005) 'Promoting interactive in-class learning environments: A comparison of an electronic response system with a traditional alternative' Proceedings of the 11th Australasian Teaching Economics Conference, pp. 23-33.

Freeman, S., O'Connor, E., Parks, J. W., Cunningham, M., Hurley, D., Haak, D., Dirks, C. & Wenderoth, M. P. (2007) 'Prescribed active learning increases performance in introductory biology' CBE–Life Sciences Education 6: 132-139.

Hay, R. H. & LeSage, A. (2009) 'Examining the benefits and challenges of using audience response systems: A review of the literature' Computers and Education 53: 819-827.

Husinger, M., Poirier, C. R. & Feldman, R. S. (2008) 'The roles of personality and class size in student attitudes toward individual response technology' Computers in Human Behaviour 24: 2792-2798.


Jones, C., Connolly, M., Gear, A. & Read, M. (2001) 'Group interactive learning with group process support technology' British Journal of Educational Technology 32: 571-581.

King, S. O. & Robinson, C. L. (2009) '"Pretty Lights" and Maths! Increasing student engagement and enhancing learning through the use of electronic voting systems' Computers and Education 53: 189-199.

Lantz, M. E. (2010) 'The use of clickers in the classroom: Teaching innovation or merely an amusing novelty?' Computers in Human Behaviour 25: 556-561.

Lou, F. & Lorimer, J. (2010) 'How to use TurningPoint' Blended Learning in Practice, 27-34.

Moss, K. & Crowley, M. (2011) 'Effective learning in science: the use of personal response systems with a wide range of audiences' Computers and Education 56: 36-43.

Nicol, D. J. & MacFarlane-Dick, D. (2006) 'Formative assessment and self-regulated learning: A model and seven principles of good feedback practice' Studies in Higher Education 31 (2): 199-218.

Riding, R. J. & Sadler-Smith, E. (1997) 'Cognitive style and learning strategies: some implications for training design' International Journal of Training and Development 1 (3): 199-208.

Russell, M. (2008) 'Using an electronic voting system to enhance learning and teaching' Engineering Education 3 (2): 58-65.

Stav, J., Nielsen, K., Hansen-Nygård, G. & Thorseth, T. (2010) 'Experiences obtained with integration of student response systems for iPod Touch and iPhone into e-Learning environments' Electronic Journal of e-Learning 8 (2): 179-190.

Steinert, Y. & Snell, L. S. (1999) 'Interactive lecturing: strategies for increasing participation in large group presentations' Medical Teacher 21 (1): 37-42.

Zhao, C.-M. & Kuh, G. D. (2004) 'Adding value: Learning communities and student engagement' Research in Higher Education 45 (2): 115-138.


Assessment for learning in practice

Sharing good teaching and learning practice is clearly a valuable exercise for all educators. At a recent meeting of the University of Hertfordshire Learning and Teaching Network Forum, colleagues were asked to share examples of 'good practice' in assessment for learning. Listed below is a selection of the examples that were shared, and we hope that they may prove useful to readers. Thanks to Sarah Flynn and Mark Russell of the University of Hertfordshire Learning and Teaching Institute for compiling and editing these examples.

Good practice in assessment for learning…

…Engages students with assessment criteria

Asking students to mark another student's work (peer assessment) not only shows them how others tackle the same task but also helps them understand the assessment (and marking) criteria.

Try showing students a selection of student submissions to a similar assessment task and ask them to rank the submissions and explain why they have ranked them in the way they have. Discussions might cover what "good" looks like and how we know it, how well the submissions meet the brief and/or the assessment criteria, and how students can take the learning from this activity and apply it to their own assessment task.

…Supports personalised learning

When students submit their work, ask them which areas of the submission they would particularly like feedback on. Prioritising your effort in responding to these requests encourages the students to be active in progressing their own learning, seeking feedback and then acting on it.

When students submit their work, ask them to provide a short narrative/reflection on the perceived strengths and weaknesses of the submission. You could then use this narrative as a useful basis on which to personalise your feedback.

…Focuses on student development

Set assessments that are meaningful and authentic, and where possible allow students to choose the focus of their assessment. Where individual choice is not practical, look to see how you can use real case studies, data from real-world examples or issues that are currently in the news. This should help to prevent students seeing an assessment task as meaningless or as just a hurdle to jump over.

Make sure that assessments do not encourage a surface approach to learning. Set assessment tasks that explicitly require students to demonstrate integrative thinking, to link concepts together and to go beyond regurgitation of your own lecture notes. The feedback you provide should recognise and reward this.


…Ensures feedback leads to improvement

Try splitting a large assessment into two smaller assessments spread across the semester. Make the first a draft task relating to the second and ask the students to demonstrate how the feedback on the draft influenced their final submission.

Don't just write comments that correct students' work; make sure that you include feedback suggesting how they can improve in the future. Think about how you might separate your corrective comments from your improvement comments so that the distinction is clear to the students, e.g. using different colour inks, using symbols, or using a feedback template with clearly defined sections.

…Stimulates dialogue

In a lecture or seminar setting, after an assessment has been introduced to the students, ask them to write down one thing that they are confident about tackling in the assessment and one thing that they still want clarification on, and then to share these in small groups before feeding back to you.

Post questions on module-related discussion forums that ask students to describe and discuss the feedback they have received; try asking "in your own words, what did the feedback mean to you?", "what will you do differently on your next assessment as a result of this feedback?" or "what feedback didn't you understand?".

…Considers staff and student effort

Rather than set one or two large pieces of assessment towards the end of a semester, try setting a series of smaller assessments (fortnightly/monthly) across the semester to distribute the students' effort across the study period. Self, peer and technology-supported assessment will help make this manageable.

If you are using multiple choice questions, try asking your students to construct the questions rather than doing it yourself. This enhances learning, saves you time, and you can use the questions for in-class testing using Electronic Voting Systems or Optical Mark Readers, or to construct online quizzes in StudyNet.


Peer Assessment: Educationally Effective and Resource Efficient

Helen Barefoot, Fang Lou and Mark Russell
Learning and Teaching Institute and School of Life Sciences
h.barefoot@herts.ac.uk

Abstract

Peer assessment was included within a Level 4 Human Physiology module at the University of Hertfordshire following a periodic programme review during the academic year 2006-2007. The peer assessment exercise was thought to be beneficial in terms of student learning as it engaged students explicitly with marking criteria, stimulated dialogue around assessment and feedback, and ensured prompt feedback. It was beneficial for staff as it reduced the marking burden and enabled students to receive prompt feedback on their work.

Performance on subsequent laboratory reports supported the argument that peer assessment enhanced student learning and that the skills associated with data analysis and academic writing can be transferred across modules. Comparison of student performance on a laboratory report (tutor assessed) submitted prior to the peer assessment activity with a later submission of a laboratory report (tutor assessed), which took place after the peer assessment activity, demonstrated a statistically significant improvement in performance on the second assignment (p < …


… of the subject. As with most Bioscience departments in the UK, the University of Hertfordshire encourages the development of scientific understanding through the writing and assessment of laboratory reports (in addition to other assessment methods throughout the programme).

Context

Within the Biosciences programme at the University of Hertfordshire (UH), all modules at level 4 (entry level Undergraduate) contain an assessment of at least one laboratory report. Reports tend to be approximately 1000-2000 words in length, depending on the experiment. Prior to a redesign of the curriculum within a periodic review in 2007, students studied a 15 credit point (cp) module in Human Physiology (Semester A). A significant part of the assessment for the module (20%) was a laboratory report which assessed learning outcomes associated with demonstrating knowledge and understanding of physiological processes and the interpretation of physiological data. All other modules within the programme also used a laboratory report to assess learning outcomes associated with data interpretation and understanding of the subject material.

One of the consequences of the Biosciences programme review was an increase in the size and scope of the Human Physiology module from 15cp to 30cp (Semesters A and B). The curriculum review enabled the module team to consider and re-design the teaching materials, activities and assessment methods within the module. To enhance the educational effectiveness of the module, the staff team designed a peer assessment activity (20% of the module mark) to replace the tutor-marked laboratory report. The peer assessment had an additional benefit in that it reduced the staff marking burden, which was important as the programme and module had experienced an increase in student numbers. The first cohort completed the new Human Physiology module (30cp), including the peer assessment activity, during the 07/08 academic year.

Rationale for introducing peer assessment

Within the 'old' programme, despite students completing a number of laboratory reports throughout an academic year, there was little evidence to indicate that their scientific writing ability was improving over time. For example, when comparing student performance on a laboratory report submitted within a Semester A module with comparable submissions within a Semester B module, there was little if any improvement in performance (figure 1). Appreciating that the subject discipline was different (human physiology [Sem A] and biochemistry [Sem B]), the data suggest that student understanding of what was required in a report had not developed, despite students receiving written feedback from academic staff and provision of the assessment criteria. Analysis of the data using paired t-tests demonstrates no significant difference (ns; p>0.05) in performance between the Semester A submissions and the Semester B submissions.
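For readers unfamiliar with the comparison being described, the following minimal Python sketch illustrates the kind of paired analysis reported here: the same students' marks from two semesters are paired and tested for a difference in means, alongside the mean ± SEM summaries of the sort plotted in figure 1. The marks below are invented for illustration (they are not the study's data), and the sketch assumes numpy and scipy are available.

```python
# Illustrative only: hypothetical marks for the same ten students in two semesters.
import numpy as np
from scipy import stats

sem_a = np.array([52, 61, 58, 47, 66, 70, 55, 63, 49, 60])  # Semester A report marks (%)
sem_b = np.array([54, 59, 60, 45, 68, 69, 57, 62, 50, 61])  # Semester B report marks (%)

def mean_sem(marks: np.ndarray) -> tuple[float, float]:
    """Return the mean and standard error of the mean (SEM), as used in figure 1."""
    return marks.mean(), marks.std(ddof=1) / np.sqrt(len(marks))

t_stat, p_value = stats.ttest_rel(sem_a, sem_b)  # paired t-test on matched students
print("Semester A: %.1f +/- %.1f (mean +/- SEM)" % mean_sem(sem_a))
print("Semester B: %.1f +/- %.1f (mean +/- SEM)" % mean_sem(sem_b))
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}",
      "(ns)" if p_value > 0.05 else "(significant)")
```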


Figure 1: Mean student performance on laboratory reports submitted in Semester A and Semester B (ns = not significant; data expressed as mean percentage ± standard error of the mean (SEM), shown by the error bars).

Peer assessment is a method that actively engages students in the marking process (Falchikov, 2004) and provides insight into what is expected within an assignment. It actively engages students with the assessment criteria and helps clarify good performance (Rust et al., 2005). Expressing criteria in ways which are understandable to students can be challenging, and students may interpret criteria in different ways, possibly dependent on their social and cultural background (Bloxham et al., 2004) as well as their previous experience within the discipline. Distributing and/or explaining criteria to students may not be sufficient for student understanding (Rust et al., 2005), yet active engagement with assessment criteria has been shown to help students interpret the criteria and understand how they will be assessed by tutors (Elwood and Klenowski, 2002).

Peer assessment encourages dialogue around learning amongst students and promotes interaction between staff and students (Gibbs and Simpson, 2004; Nicol and Macfarlane-Dick, 2005), as well as preparing students for lifelong learning, in that it provides opportunities for students to give constructive feedback to peers (Orsmond, 2004). In addition to the benefits that peer assessment brings to student learning and personal development, peer assessment can also reduce marking burdens for academic staff members, thus providing an effective yet efficient assessment method.

It was hoped that peer assessment would aid student understanding of the requirements of a scientific report, improve experimental data analysis and also reduce staff marking burdens, enabling feedback to be returned to students much more quickly than was previously possible through tutor marking.


Aims of the study

The aim of the study was to determine whether peer assessment was beneficial to student learning, through consideration of student performance on subsequent laboratory reports and the students' own self-reflection.

Method: the peer assessment process

Since the academic year 2007-2008, students on the Human Physiology module (level 4) at UH have taken part in peer assessment of their laboratory reports. Each year between 130 and 200 students registered on the module. The peer assessment activity formed part of the summative assessment for the module and contributed 20% of the module grade. All students took part in a laboratory class lasting up to four hours, during which the experimental data were recorded and discussed. After all students had completed the laboratory class they attended a one-hour workshop designed to provide insight and guidance on the peer marking activity (figure 2).

Aims of the preparatory workshop:
- To provide the rationale for peer marking
- To discuss the benefits of peer assessment
- To describe the peer marking process
- To provide guidance on laboratory report writing
- To consider the specific marking criteria for the report
- To gather student perceptions regarding peer assessment and discuss any student concerns regarding the activity.

Figure 2: Aims of the workshop that students undertook prior to the peer marking activity

The students then submitted two copies of their report: an online submission, to be used for staff moderation purposes, and a paper copy for distribution during the marking session. The peer assessment marking activity took place one week after submission of the report and was conducted over a two-hour period. Each student was randomly given another student's report (tutors ensured that no students received their own report) and the tutor guided the students through each section of the report according to detailed marking criteria. Two or three additional tutors were available to answer any queries during the marking session. Each section (abstract, introduction, methods, results, discussion, references) was marked in turn and the students provided annotated feedback as well as allocating marks for each section. Reports were then returned to the original author to review the mark and the comments. The results were recorded by the tutor.
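The random distribution step described above (each student marks another student's report, and never their own) can be sketched as a simple cyclic shuffle. This is an illustrative Python sketch of one way such an allocation could be produced, not the procedure the module team actually used; the student identifiers are hypothetical.

```python
# Illustrative sketch: allocate each submitted report to a different student for
# peer marking, guaranteeing that nobody marks their own report.
import random

def allocate_reports(student_ids: list[str]) -> dict[str, str]:
    """Return a mapping: marker -> author of the report they will mark."""
    if len(student_ids) < 2:
        raise ValueError("Need at least two students for peer marking")
    shuffled = student_ids[:]
    random.shuffle(shuffled)
    # Rotate the shuffled list by one place so every marker receives someone else's report.
    return {marker: shuffled[(i + 1) % len(shuffled)]
            for i, marker in enumerate(shuffled)}

# Example with hypothetical student identifiers:
allocation = allocate_reports(["s001", "s002", "s003", "s004", "s005"])
for marker, author in allocation.items():
    assert marker != author  # no one marks their own report
    print(f"{marker} marks the report written by {author}")
```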


Moderation of scripts was carried out per UH academic quality processes, ensuring that any assignments deemed to have failed (…


… difference (p < …


Figure 5: Student submission timeline

Student performance on the three laboratory reports can be seen in figure 6.

Figure 6: Mean student performance on the sequential laboratory reports submitted by level 4 students (x-axis: academic year of study).

To determine whether student performance significantly improved within the IBMAP module since the inclusion of peer assessment within the HP module, data were analysed using paired t-tests (*** = p < …


Figure 7: Mean student performance on the laboratory reports submitted by level 4 students within the IBMAP module (before and after completion of peer assessment of a laboratory report within the Human Physiology module).

As can be seen in figure 7 above, student performance on the second laboratory report (Semester B) was significantly improved (p < …


… study, student understanding would have improved, as could their ability to analyse and report scientific data. This argument is not supported by the early data, however (figure 1), but unfortunately a direct comparison between 'pre peer assessment performance' and 'post peer assessment performance' within the IBMAP module cannot be made, as this module did not exist prior to the periodic review.

Other confounding factors must also be considered: the two assignments, although within the same module and assessed at the same level, were reports on different experiments and may have had slightly different demands in terms of data analysis and subject understanding. The pre-submission support for the two submissions may have differed (e.g. data analysis workshops, scientific writing guidance), and different staff members ran the experiments and marked the reports, which may have influenced student understanding. It would therefore be premature to conclude that peer assessment alone resulted in the improved performance on the subsequent laboratory report.

Enhancement of the peer assessment process

Although it was felt by tutors that the peer assessment activity was beneficial, and the data suggested benefits, there was little evaluation of the student perception of the process. There were also concerns over the administrative burden for staff, caused by high numbers of students requesting moderation. During the first two years of peer assessment a number of students (approximately 20%) complained to the tutor, immediately after the marking session, that their mark was too low and that they wanted the staff member to remark their work. The higher moderation requirements increased the staff burden and nullified some of the time gains which the peer assessment activity had brought (figure 8).

Challenges associated with the peer assessment process:
- High moderation requirement for the teaching team
- Even after moderation, a number of students still questioned their marks, further increasing the workload of the module team
- Evident lack of self-reflection (developing self-reflection was one of the intended benefits of the peer assessment)

Figure 8: Challenges within the peer assessment process

During the 2009-2010 academic year, the peer assessment activity was enhanced within the HP module and a web-based data gatherer was introduced to make the peer assessment process more effective and efficient. Through the data gatherer, students were encouraged to review their peer assessment feedback, to reflect on the experience of marking another student's work, and to identify what they needed to do to improve future laboratory reports.


They were also able to request remarking if they could indicate, against the marking criteria, where they felt they had been unfairly marked (either too high or too low).

Introduction of a reflective questionnaire

Following the marking activity, students were asked to reflect on their learning experience by answering a web-based questionnaire containing 27 questions. A web-based data collection facility (data gatherer) had been re-purposed specifically to support this activity (Russell, 2006). Five percent of the marks associated with the laboratory report were allocated to the reflection and feedback activity.

The questionnaire contained eight questions which could be answered on a Likert scale (SA = Strongly Agree, A = Agree, NAND = Neither Agree Nor Disagree, D = Disagree, SD = Strongly Disagree), two questions with radio buttons providing a choice of one out of five options (e.g. "which section of the report did you find most challenging?"; options: abstract, introduction, methods, results, discussion, references) and seventeen free-text questions enabling students to provide qualitative comments. Through the questionnaire, students were also asked to identify whether they thought they had been over- or under-marked during the peer assessment activity. If they indicated that they had been mismarked and wished to request moderation of their script, they had to justify specifically, against the assessment criteria, where they thought the mismarking had occurred. It was hoped that this would limit the immediate requests for moderation which had been experienced in the previous two years.

The questionnaire was opened to students three days after the marking activity and the return of their assignments. The students had up to one week to complete the questionnaire.

Quantitative findings

Over eighty percent of the students (82%) took the opportunity to reflect on their learning through completion of the online questionnaire (148 students out of a total of 181). The vast majority of the students engaged very well in answering all 27 questions, and provided detailed free-text comments. As such, all but three students gained the full five marks allocated for feedback and reflection.

The first part of the questionnaire used closed questions relating to reflection on the report and the peer assessment process. A selection of results is provided below.
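As an aside, the percentage agreement figures quoted below (e.g. 77% and 83%) are the kind of summary that can be produced directly from stored questionnaire responses. The following minimal Python sketch, using invented responses rather than the study's data, shows one way a Likert-scale question could be tallied into the five categories and an overall 'agree' percentage.

```python
# Illustrative sketch with invented responses; not the data collected in the study.
from collections import Counter

LIKERT = ["SA", "A", "NAND", "D", "SD"]  # Strongly Agree ... Strongly Disagree

def summarise(responses: list[str]) -> None:
    """Print counts and percentages per Likert category, plus overall agreement."""
    counts = Counter(responses)
    total = len(responses)
    for category in LIKERT:
        n = counts.get(category, 0)
        print(f"{category:>4}: {n:3d}  ({100 * n / total:.0f}%)")
    agree = counts["SA"] + counts["A"]
    print(f"Agreement (SA + A): {100 * agree / total:.0f}%")

# Hypothetical answers to "peer assessment was beneficial for my learning":
answers = ["SA"] * 40 + ["A"] * 74 + ["NAND"] * 20 + ["D"] * 10 + ["SD"] * 4
summarise(answers)
```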


Figures 9 and 10: Student responses to the indicated statements (SA = Strongly Agree, A = Agree, NAND = Neither Agree Nor Disagree, D = Disagree, SD = Strongly Disagree).

Figures 9 and 10 indicate that the majority of students who answered the questionnaire explicitly expressed that peer assessment was beneficial for their learning (77%) and that, as a consequence of the activity, they felt better prepared for their next laboratory report (83%).

80% of students also reported that the process of marking someone else's work was beneficial, particularly in terms of considering their own work in relation to someone else's (83%) (Figures 11 and 12).

Figures 11 and 12: Student responses to the indicated statements.

Students also indicated the benefits of being engaged with the marking criteria prior to writing the report, with over 83% either 'strongly agreeing' or 'agreeing' (Figure 13).


Figure 13: Student responses to the indicated statement.

Qualitative findings

Students provided free-text comments in response to the question: 'What other comments do you have regarding the peer assessment process?' There were some extremely positive responses, such as:

'The marking criteria was a great help and provided a guideline for future lab reports.'

'I felt that the marking scheme and even reading other peoples work made me reflect upon what I was good at and what I could work on/add to in my report and future reports.'

'Good indication of what our lecturers would be looking for as well perhaps of how our peers may view our written work. Additionally from this exercise, I can gauge how effective my written communication skills are in the scientific field norms.'

Interestingly, even where some students expressed a negative opinion about the peer assessment activity, there was indication of student learning:

'I don't feel that I benefited from the peer assessment activity, however, the detailed marking scheme did help to show what the lecturers are looking for in future reports.'

'It is quite a bit risky marking someone else's work but it was quite beneficial in understanding the criteria used in marking our laboratory reports.'


Reductions in moderation burden

To enhance the resource efficiency of the activity, students were asked to comment on their mark and indicate, with explicit reference to the marking criteria, where they believed they had been over- or under-marked. Sixteen reports were moderated using the online submitted report. Extra marks were awarded to ten reports (9.4 ± 1.8%, mean ± SEM). It was estimated that this saved approximately 25 staff marking hours (Lou et al., 2010).

Discussion of student reflections

The introduction of the reflection element of the peer assessment provided greater insight into student opinions on the activity, and further support for the argument that peer assessment benefits student learning.

As considered above, the data suggest that the peer assessment activity supported the development of scientific writing skills that students were able to transfer across modules. However, confounding factors prevented this from being a definitive conclusion. The reflections from students do, however, support the argument that peer assessment benefited their learning through their increased understanding of what is required within laboratory reports, and also indicate that they felt better prepared for future reports.

Providing students with a structured opportunity to reflect on their feedback and mark, together with consideration of their experiences of marking someone else's report, reduced the moderation burden for staff. Previously, a number of students (approximately 20%) had approached staff immediately after the marking session to complain about being marked unfairly and to request staff re-marking. The complaints were often related to one or two marks within a specific section, and it was evident from the immediacy of the complaints that the students had not reflected effectively on their own work in relation to the assessment criteria and the feedback they had received. The structured reflection via the data gatherer reduced these challenges, as students were given time to consider their mark and the comments in detail. Requests for moderation of marks had to be explicitly justified according to the assessment criteria, and only 9% of the students made a case for additional moderation.

Conclusion

Analysis of the results of peer assessment incorporated into a level 4 Human Physiology module at the University of Hertfordshire demonstrates the potential to engage students in productive learning. Qualitative and quantitative findings indicate that students benefited from being actively engaged with assessment criteria, both prior to the writing of the laboratory report and during the marking process.


The peer assessment activity helped students to better understand the requirements of scientific writing; skills which they were able to transfer to other modules within their programme of study. The peer assessment activity also provided prompt and relevant feedback, as all students had their marked scripts returned to them during the marking activity, which took place one week after submission of the report. The marking session certainly stimulated dialogue between staff and students, as well as between peers. This encouraged deep learning and, hopefully, a better understanding of the subject material. All of these benefits are recognised within principles of good assessment and feedback (Gibbs and Simpson, 2004; Nicol and Macfarlane-Dick, 2005).

The introduction of the data gatherer in 2009-2010 enabled students to reflect more effectively on their learning. Through the comparison of their own work with someone else's and against the marking criteria, they felt better prepared for future assignments. The peer assessment process not only has the potential to benefit student reflection and learning but also provides tangible benefits to staff in terms of reducing the marking burden. The peer assessment activity adopted at UH thus demonstrates educational effectiveness together with resource efficiency; two things all busy academics must be striving for within their teaching and assessment.

References:

Bloxham, S., and West, A. (2004). Understanding the rules of the game: marking peer assessment as a medium for developing students' conceptions of assessment. Assessment and Evaluation in Higher Education, 29(6), 721-733.

Davies, P. (2006). Peer-assessment: judging the quality of students' work by comments rather than marks. Innovations in Education and Teaching International, 43(1), 69-82.

Elwood, J., and Klenowski, V. (2002). Creating communities of shared practice: assessment use in learning and teaching. Assessment and Evaluation in Higher Education, 27(3), 243-256.

Falchikov, N. (2004). Improving assessment through student involvement: practical solutions for learning in higher and further education. New York: Routledge Falmer.

Falchikov, N., and Goldfinch, J. (2000). Student peer assessment in Higher Education: a meta-analysis comparing peer and teacher marks. Review of Educational Research, 70, 287-322.

Ferguson, G., Sheader, E., and Grady, R. (2008). Computer-assisted and peer assessment: a combined approach for assessing first year laboratory practical classes for large numbers of students. Bioscience Education, 11:4.


Gibbs, G., and Simpson, C. (2004). Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1, 3-31.

Hughes, I.E. (2005). Assessment for Learning. The Higher Education Academy Centre for Bioscience. [Presentation 13/04/2005, University of Reading]

Lou, F., Barefoot, H., Bygate, D., and Russell, M. (2010). Using technology to make an existing assessment more efficient. [Poster presentation at the International Blended Learning Conference]

Lui, C., and Tsai, C. (2005). Peer-assessment through web-based knowledge acquisition: tools to support conceptual awareness. Innovation in Education and Teaching International, 42(1), 43-59.

Nicol, D.J., and Macfarlane-Dick, D. (2005). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Orsmond, P. (2004). Self- and peer-assessment: guidance on practice in the biosciences. In Teaching Bioscience Enhancing Learning Series, eds S. Maw, J. Wilson, and H. Sears, pp. 1-47. Leeds: The Higher Education Academy Centre for Bioscience.

Pope, N. (2005). The impact of stress in self- and peer-assessment. Assessment and Evaluation in Higher Education, 25(4), 341-352.

Quality Assurance Agency (QAA). (2007). Subject benchmark statement: Biosciences. Accessed via: http://www.qaa.ac.uk/academicinfrastructure/benchmark/statements/Biosciences07.asp#preface [15.4.11]

Russell, M. (2006). Evaluating the weekly-assessed tutorial sheet approach to assessment: the students' experience. Journal for the Enhancement of Learning and Teaching, 3(1), 37-47.

Rust, C., O'Donovan, B., and Price, M. (2005). A social constructivist assessment process model: how the research literature shows us this could be best practice. Assessment and Evaluation in Higher Education, 30(3), 231-240.

Topping, K.J. (1998). Peer-assessment between students in colleges and universities. Review of Educational Research, 68(3), 249-276.

Zevenbergen, R. (2001). Peer-assessment of student constructed posters: assessment alternatives in perspective mathematics education. Journal of Mathematics Teacher Education, 4, 95-113.


APPLICATIONS OF EVS

Karen Robins, Business School
k.robins@herts.ac.uk

How did you use EVS in your teaching?

For seven weeks over a semester, an EVS drop quiz for summative feedback was given in the lecture. The best five quiz results were then used as part of the students' assessment, worth 10% of the overall module mark. Each quiz had six questions, with a mix of both qualitative and quantitative questions. The information tested in each quiz was from the previous week's lecture, to give students time to absorb the lecture information and prepare. MCQs were created from a bank of questions that had been written by previous students on the same module as part of their assignment.

A weekly league table was uploaded to StudyNet to show how students were progressing. The results were shown by student registration number rather than name. Students achieving more than 70% were highlighted in green, from 40% to 70% in amber, and below 40% in red. Comments were also attached next to the mark on the league table.

Why did you choose to use EVS in this context?

Students are familiar with the use of multiple choice questions to test their knowledge and find the technology easy to use. From personal experience of using EVS in previous modules, it was clear that students enjoyed using it for formative feedback. EVS also appears to engage all students, not just the good students. It is easy to develop an EVS test using TurningPoint EVS software, as it provides an extra 'tab' in PowerPoint to create multiple choice questions and capture student responses.

What were the benefits of your application of EVS for you and your students?
- Academic staff receive immediate feedback regarding misunderstanding of the concepts and ideas being communicated
- EVS provides instant formative feedback to students, which may also be directly relevant to summative assessments
- EVS encourages a high level of interaction within lecture sessions
- EVS use prompted increased attendance (average 84%) and participation in lectures
- If student responses showed a misunderstanding, concepts could be clarified immediately
- Experience showed that students would put in extra effort even for a small proportion of marks, thereby encouraging a learner-centred approach
- It is possible to identify early in the semester which students need help and to put support mechanisms in place
- Students do not feel threatened or 'stupid' if they do not know the answer, as the responses are anonymous to their peers
- International students, who do not normally participate much in lectures, are happy to submit their answers using EVS


Did you experience any drawbacks or problems with using EVS that may prompt you to modify or develop the use outlined above?

It took several weeks for students to collect their handsets, despite frequent reminders and 9-5 opening of the office. Handsets have to be distributed from this office as they need to be registered to a student to enable their scores to be used for summative assessment, and to ensure the return of handsets at the end of Semester B so that they can be used by level 4 students in 2011-12.

The report format provided by TurningPoint is not intuitive. Many different reports are provided and perhaps there is too much choice?

Some student handsets did not appear to work, although it could be the students trying to get round having to do the quiz. If this happened, the students were requested to return the handset and get a new one. The downside of this was that students then had two handset numbers attached to their name, and updating the marks for summative feedback took extra time.

If students forgot to bring their handsets, it was not possible to capture their marks.

Future uses of EVS

Attendance monitoring is required for level 4 and 5 students and may be required for postgraduate students in future. EVS could easily capture this information.

EVS could be used to capture student feedback to help module leaders improve their modules in future.
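The 'best five of seven' weighting and the red/amber/green league table that Karen describes above reduce to a couple of simple rules. The sketch below illustrates the logic only: the thresholds come from the description above, while the function names and example scores are invented for the purpose of illustration.

    # Illustrative sketch: best five of seven weekly EVS quiz percentages,
    # scaled to the 10% of the module mark, plus the league-table banding.

    def quiz_contribution(weekly_scores, best_n=5, weight=10):
        """Average of the best n quiz percentages, scaled to 'weight' module marks."""
        best = sorted(weekly_scores, reverse=True)[:best_n]
        return weight * (sum(best) / len(best)) / 100.0

    def league_band(percentage):
        """Traffic-light band shown against each registration number."""
        if percentage > 70:
            return "green"
        if percentage >= 40:
            return "amber"
        return "red"

    scores = [55, 80, 72, 40, 90, 65, 85]          # one student's seven weekly quizzes
    print(quiz_contribution(scores))                # 7.84 marks out of 10
    print(league_band(sum(scores) / len(scores)))   # "amber" (average of about 69.6%)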


Amanda Jefferies, School of Computer Science

How did you use EVS in your teaching?

We have been using EVS across the first year BSc Computer Science undergraduate cohort. Over 300 students have been issued with handsets and they are being used in a variety of ways by different academics.

Why did you choose to use EVS in this context?

Overall, we wanted to encourage greater engagement with learning among our first year students.

What were the benefits of your application of EVS for you and your students?

Academics have been able to experiment through this semester with new ways of offering feedback and with new styles of assessment, especially 'low stakes' and formative ways of asking students for their input. Although it was a new technology to many colleagues, there has been an enthusiasm to trial new ways of providing prompt feedback to students on their understanding.

Did you experience any drawbacks or problems with using EVS that may prompt you to modify or develop the use outlined above?

In some rooms the hardware and software has yet to be installed, which has meant taking along the specialist receiver and in some cases an extra laptop with the software already loaded. Colleagues have got round this difficulty by using a handheld receiver with smaller groups. This was already owned by the School, since our Change Academy for Blended Learning Enhancement (CABLE) project experimented with MSc students using EVS.


Jackie Willis, School of Life Sciences, Health & Human Sciences

How did you use EVS in your teaching?

EVS was used for teaching mathematics and statistics to bioscientists, firstly in a diagnostic maths test, followed by workshops.

Why did you choose to use EVS in this context?

I was dealing with a large student cohort (about 180 students in one session). It was important to establish that the students were following the explanations and able to process the information correctly. EVS was the only means of checking that all of the students were engaging and had understood the material.

What were the benefits of your application of EVS for you and your students?

During the session students were able to confirm that they had the correct answer to the problem, so feedback was immediate. This provided the opportunity to run through explanations again if necessary. For the diagnostic test, I was also able to capture the responses for each individual student using the quiz feature of TurningPoint. At the end of the test, the marks were easily downloaded into an Excel spreadsheet and published on StudyNet using just the student ID number. This also meant that I could allocate students to workshops that met their learning needs, based on how they scored in each section of the test. Without EVS I would have marked the test by scanning OMR sheets, which would have taken a few hours of my time instead of just a few minutes.

Did you experience any drawbacks or problems with using EVS that may prompt you to modify or develop the use outlined above?

No, but I have found that instead of just using multiple choice questions for the answers I can just as easily ask for the numeric answer to be input, adding a new dimension to using EVS.
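The workshop allocation Jackie describes, streaming students according to how they scored in each section of the diagnostic test, could be done straight from the downloaded spreadsheet. The following sketch is hypothetical: the section names, pass mark and file layout are assumptions rather than details of the actual module.

    # Hypothetical sketch: allocate students to support workshops from a CSV of
    # per-section diagnostic scores exported from the TurningPoint/Excel results.
    import csv

    SECTIONS = ["algebra", "graphs", "statistics"]   # assumed section names
    PASS_MARK = 60                                   # assumed threshold (%)

    def workshops_needed(row):
        """Sections in which the student scored below the pass mark."""
        return [s for s in SECTIONS if int(row[s]) < PASS_MARK]

    with open("diagnostic_scores.csv", newline="") as handle:  # hypothetical file
        for row in csv.DictReader(handle):
            weak = workshops_needed(row)
            print(row["student_id"], weak if weak else "no workshop needed")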


Iain Cross, School of Life Sciences

How did you use EVS in your teaching?

1. Regular drop-quizzes on material taught both during the lecture and in previous sessions.
2. As an ice-breaker to a subject and to ascertain existing understanding of new topic material.
3. Answering questions generated by students' peers in small groups.
4. As a practice technique for answering MCQs.

Why did you choose to use EVS in this context?

For each of the above:
1. To improve memory and recall of key facts, and to understand how students' knowledge is developing and identify areas for additional teaching or revision support.
2. To overcome the stigma of responding in public to questions and remove the fear of giving a 'wrong' answer.
3. To encourage students to reason what would be an incorrect answer to a question as well as the right answer, developing a deeper understanding of the material and helping to identify differences between potentially similar answers.
4. As well as overcoming the stigma of responding, this provides an ideal opportunity to explain incorrect answers; students can gauge their own performance to prioritise revision subjects.

What were the benefits of your application of EVS for you and your students?
- Helps me to identify students' knowledge in order to adjust future teaching and requirements for support
- Helps students appreciate their own performance
- Communicates clearly the expected level of knowledge required for success in examinations
- Removes the barrier of embarrassment and stigma when responding in public; improves student engagement with the material
- Adds an element of 'fun' to lectures: novel technology and often fun and amusement(!) when discussing answers
- Breaks up lengthy lectures into shorter segments

Did you experience any drawbacks or problems with using EVS that may prompt you to modify or develop the use outlined above?
- Maintaining student focus when discussing answers, as they may become more interested in the range of responses than in understanding why their selection was correct/incorrect
- Students who have forgotten EVS handsets may become isolated from sessions
- Occasional difficulties in receiving votes
- More advanced uses of EVS (e.g. for summative assessments, attendance monitoring etc.) could be unappealing as further training may be required
- Time spent initially preparing slides can be significant
- If not used with care there is a risk of trivialising subjects into simple facts
- Reliance on an old laptop computer which is underpowered for EVS software

Click the image below for Iain's video on EVS


Jenny Lorimer, School of Health & Emergency Professions

How did you use EVS in your teaching?

An electronic voting system (EVS) was used to enable conditional branching at key decision-making points during a series of imaging scenarios. A conditional branch (in computer programming) is an instruction that directs the computer to another part of the program based on the result of a comparison. In practice, conditional branching here refers to the application of the EVS to allow the student group to control the order of PowerPoint slides in a presentation, based on the responses received to posed questions.

Why did you choose to use EVS in this context?

EVS was used to provide immediate, relevant feedback to students in a non-threatening manner while enhancing the student experience in a large cohort that is diverse both in terms of age and ethnicity. A project was used to investigate the feasibility and suitability of embedding conditional branching as an integral teaching tool for a large group throughout an undergraduate module.

What were the benefits of your application of EVS for you and your students?

One of the strengths of conditional branching was found to be the ability to link the imaging scenarios closely to clinical practice. Relevant images were included at key decision-making points and subsequent explanations were described to the group. Students were able to experience how an inappropriate decision could have a variety of different adverse outcomes for a patient.

During the teaching, the focus of the conditional branching was as a tool for helping the students to understand topics as a whole and as an opportunity to develop their critical thinking and decision-making skills. Prior to each conditional branching session the students had had a number of teaching sessions on a given topic area. It was clear throughout the teaching that the conditional branching was a successful method of increasing student engagement and classroom interactivity, despite the large size of the group. Although students had individual handsets, there was continued evidence of student collaboration and social learning as students discussed options in small groups with their peers prior to voting.

Did you experience any drawbacks or problems with using EVS that may prompt you to modify or develop the use outlined above?

The lack of right or wrong answers in the clinical scenarios is initially alien compared to the more common use of EVS and therefore required a level of reassurance. The main challenge was the length of time that the preparation of the scenarios took, a key factor being the need to reduce the complexity of real-life cases into a simplified format that could be used and understood more easily as teaching examples.


One of the complexities of the concept was that once students had decided a route by majority decision there was no easy way to go back and review alternative routes. This was overcome by adding summary slides at the end of each scenario. Whatever decision had been made, and whatever patient pathway had been taken, the students were still exposed to all of the alternatives. In this way, when the group had made an inappropriate choice they were able to understand the reasons for it being inappropriate, and the slide could be revisited, permitting the students to review the question in the light of new knowledge.

Click the image below for Jenny's video on EVS
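The conditional branching Jenny describes, in which the most popular handset response decides which slide is shown next, is in effect a lookup from the winning option to a slide number. The sketch below is a generic illustration of that logic rather than a description of how TurningPoint implements it; the branch map and example votes are invented.

    # Generic illustration of conditional branching driven by EVS votes:
    # the majority response at a decision point selects the next slide.
    from collections import Counter

    # Hypothetical branch map for one decision point in an imaging scenario:
    # option voted for -> slide number to jump to.
    BRANCHES = {"A": 12, "B": 18, "C": 25}

    def next_slide(votes, branches=BRANCHES):
        """Return the slide chosen by the majority of handset responses."""
        winner, _count = Counter(votes).most_common(1)[0]
        return branches[winner]

    print(next_slide(["A", "B", "B", "C", "B", "A"]))  # 18, the branch for option B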


The creation of group wikis by first year undergraduate physiotherapy students

Sue Roscoe
University of Hertfordshire
s.l.roscoe@herts.ac.uk

Abstract

The control of infections in hospitals and other health care settings is currently an important issue. The Physiotherapy Practice Team at the University of Hertfordshire devised a group wiki assessment for first year undergraduate physiotherapy students to evaluate infection prevention issues experienced during a five-week placement. Both Practice Team staff and students received hands-on training in wiki technology. Students also received a compulsory session on infection prevention issues. In groups of 6-8, the students used wikis to record their analysis of specially devised videos relating to infection prevention. They also considered their own practice and that of others during the placement, and kept a reflective log on the wiki which recorded their own personal thoughts on the whole process. Post-assessment evaluation was augmented by use of a student questionnaire. Initial findings suggest that this method of facilitating learning in the practice environment is successful. Considerations for improvement of the activity have centred around the timing of the pre-placement assessment introductory lecture, the use of a formative activity and further information for students on the nature of remote collaborative working.

Introduction

In recent years university academics have been encouraged to revisit the teaching and assessment methods they offer undergraduate students. The shift in the way some university programmes now facilitate and assess learning may in part be due to developments in the understanding of the way students learn (Honey and Mumford, 1992). The Professional Standards Framework (PSF) encourages higher education teachers to have knowledge of how students learn and to make use of appropriate learning technologies (Higher Education Academy, 2007). As a result, a variety of innovative assessment methods are now being used which enable students to be assessed in more contemporary ways.

Traditional methods of assessment for academic programmes have focused on the end-of-term written examination, and within Physiotherapy this has typically been augmented with practical assessment of techniques. In June 2007, first year undergraduate physiotherapy students at the University of Hertfordshire undertook their first five-week practice placement at the end of year one. Their clinical practice was assessed by Practice Educators and this was supplemented by an additional piece of academic work set by the University, in which students were required to create a written proforma that detailed a critical incident in the 'infection prevention' aspect of their work. This particular assessment was produced as an individual piece of work.
Students were very much alone in their learning, and collaborative study between the students was minimal.


The Practice Team felt that the learning outcomes for this piece of work could equally be met by the creation of a group 'wiki'. The word 'wiki' is Hawaiian for 'quick' and in an educational context refers to an online resource (website) that allows users to add and edit content using their own web browser. It was felt that wiki use would encourage collaborative working, reflective practice and communication between students. Many students were alone on placement; therefore wiki engagement would also facilitate social support and allow contact with the University on a regular basis.

Current students are immersed in electronic opportunities to communicate, and this is evident in the explosive use of social networking sites such as Facebook. The shift, therefore, from paper-based assessment to wiki creation and assessment was perceived as positive for the students by the University's Practice Team, and a decision was made to embed a wiki into the infection prevention assessment part of the practice experience for a cohort of 64 first year students.

Background

Developments within the field of Information and Communication Technology (ICT) initially provided Higher Education Institutions (HEIs) with the ability to provide vast resources of materials for their students (HEFCE, 2005). Further developments have enhanced students' ability to be independent in their learning and to share information within their student bodies.

One measure of success of an HEI's ability to embed e-learning into the curriculum, as described by HEFCE (2005:9), is that 'students are able to access information, tutor support, expertise and guidance, and communicate with each other effectively wherever they are.' This is particularly important for the Physiotherapy undergraduate degree programme, as placement learning is an integral part of the course. Students must complete a total of 1000 hours in clinical practice in order to register with the Chartered Society of Physiotherapy (CSP) and be eligible to practise within the National Health Service (NHS) (CSP, 1997). Embedding e-learning into the curriculum enables students not only to engage with their placement learning activities in an enhanced way, but also to access resources remotely and continue to liaise with each other and with their academic staff.

Miers (2004) suggests that learning should, in part, be 'cooperative, collaborative, and conversational', and wiki development certainly encourages these characteristics. One of the features of constructivist learning is the inclusion of reflective practice, and wikis meet this need as they enable reflection to be experienced and shared collaboratively. Gibbs (1988), in his reflective cycle, suggests that the beliefs and feelings of other people are important in the reflective process; sharing thoughts via a wiki and receiving feedback on reflections from peers enables this reflective process to evolve transparently and in real time.

The idea that group working enhances individual learning is not new, and methods of achieving this within an increasingly diverse academic structure and student population have emerged with the development of new technology.


Parker and Chao (2007) see a wiki as 'a web communication and collaboration tool that can be used to engage students in learning with others within a collaborative environment'. This collaborative element is easily enabled by wiki technology and is especially well suited to those students undertaking placements who are unable to meet face-to-face.

Wikis are based on the idea that virtual connectivity is useful in enhancing the knowledge and learning ability of the individuals engaging with it (Alexander, 2006). As such, wikis have been described as part of a collection of social software. Their function is not, however, of sole benefit to the creators of the work; wikis can expand into a permanent source of information and knowledge for others. Wikipedia, the free online encyclopaedia, is one such well-known repository.

In 'pre-wiki' times, group online activity consisted of group members emailing a document back and forth between them and attempting to create a final edited version to reflect the group activity. Wiki functionality now enables group members to create and edit one document on a single web page (Duffy and Bruns, 2006).

Wiki technology encourages peer interaction and group work, and the sharing of knowledge and experience amongst the users is what makes it such an effective collaborative tool. Johnson and Johnson (1986) suggest that students working cooperatively achieve higher levels of thought and have enhanced memory of the learning activity compared with students who work individually. Chickering and Gamson (1987) support this idea and believe that it is the sharing of individual ideas and discussion of those of others that deepens understanding. One of the ultimate aims of higher education must be to foster a desire for lifelong learning in its students. Enabling students to work collaboratively, especially remotely, embeds some of the skills necessary for them to develop learning communities for the future, a vision shared by the Higher Education Academy (HEA, 2007).

Although there is a wealth of literature which supports wiki use, Wang and Turner (2004) suggest that there are challenges associated with this particular tool. For example, content is modifiable by any user, yet there may be pages which the tutor intended to remain in their original form. To prevent users from overriding each other, page-locking mechanisms operate and simultaneous edits are not facilitated. Boulos, Maramba and Wheeler (2006) also suggest that plagiarism is a potential threat. The posting of previously copyrighted material without the author's consent is viewed as academic misconduct, and constant vigilance is needed by the moderators and academics overseeing wiki creation by students. Similarly, constant editing of material may reduce the accuracy or truth of wiki postings (Boulos, Maramba and Wheeler, 2006).
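The page-locking behaviour mentioned above, which stops two users editing the same wiki page at the same time, can be pictured as a simple check-out rule with a timeout. The sketch below is a generic illustration of the idea only, not a description of how the StudyNet wiki (or any particular wiki engine) implements it; the class, timeout and page names are invented.

    # Generic illustration of wiki page locking; names and timeout are invented.
    import time

    class PageLock:
        def __init__(self, timeout=600):           # lock expires after 10 minutes
            self.timeout = timeout
            self.locks = {}                         # page -> (user, time acquired)

        def acquire(self, page, user):
            holder = self.locks.get(page)
            if holder and holder[0] != user and time.time() - holder[1] < self.timeout:
                return False                        # someone else is editing: refuse
            self.locks[page] = (user, time.time())
            return True

        def release(self, page, user):
            if self.locks.get(page, (None,))[0] == user:
                del self.locks[page]

    lock = PageLock()
    print(lock.acquire("infection_control_intro", "student_a"))  # True: edit allowed
    print(lock.acquire("infection_control_intro", "student_b"))  # False: page is locked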


Use of a wiki to facilitate and assess learning in the practice environment

Physiotherapy students at the University of Hertfordshire were assigned to groups of 4-8 in order to create a wiki on the topic of infection control. Each group contained a mix of placement specialities and students from different placement locations.

In terms of staff preparation for the wiki assessment, a member of the Practice Team received wiki training from the Learning and Teaching Institute at the University of Hertfordshire. This enabled staff to experience first-hand some of the challenges the students might face when engaging with wiki technology. The same member of the physiotherapy Practice Team was also responsible for the creation and organisation of training videos with the help of staff members across the School of Health and Emergency Professions. The videos featured actors playing the parts of physiotherapists, radiographers, patients and other health care personnel in various clinical settings. The videos contained scenarios that demonstrated some areas of bad practice with regard to infection prevention, but also examples of good hand-washing technique and use of alcohol gel. Reference was made to the National Institute for Clinical Excellence Guidelines for Infection Control (2003) when researching good clinical examples. The videos were located on the University's virtual learning environment, StudyNet, for remote access by students on placement.

To prepare the students for the use of the wiki in their assessment, they attended a compulsory one-hour lecture on infection prevention in May 2008, three months before the placement was due to start. This covered types of infection, the scale of the problem, how infections spread and the nature of infection prevention.

This was followed by a one-hour session in a technology suite at the University where theoretical and hands-on wiki training was undertaken by all the students. A wiki on infection control was to be produced by each group. Each was to contain core information about infection and infection control, analysis of the videos relating to infection control, personal reflection pages considering wiki use and placement experiences, and group recommendations for improved practice within their practice placements. Each group was asked to create a title for their wiki and then consider ground rules that defined their group working etiquette without stifling creativity of thought. Ground rules around use of discussion sites, distribution of content and response times were common.

The students were asked to work together on their wiki, whilst they were on placement, to identify and explain the bad practice they had observed within the videos and how it could contribute to the spread of infection. They also had to recommend how practice could be improved with reference to the evidence base.
Each student was also required to highlight areas of good and bad practice from their specific placement, and these comments then needed to be combined to produce a common source of information. The wikis needed to conclude with a list of recommendations made in response to the various placement experiences.


Each group member was also asked to maintain a reflective page within the wiki where they would post personal notes, reflections on the placement that related to infection prevention, and reflections on the experience of contributing to a wiki. These reflective pages were then to be printed off at the end of the assessment to be included in the students' professional development folders.

The importance of maintaining the anonymity of the placement sites and personnel hosting student placement education was stressed to the students during the introductory workshop on wiki use, and the need to adhere to the faculty confidentiality policy was highlighted. During the placement period the Practice Team monitored wiki use and were able to help with any technical issues which arose, either by direct observation or in response to individual requests for help by email, telephone or StudyNet communication.

The wikis were designed to be closed environments: groups could only see and edit their own wiki until the date chosen for them to 'go live', thereby helping to prevent potential plagiarism issues. After this, the wikis were viewable by other students within the teaching group but not by the general public.

When the students returned to the University after their placements in September 2008 they presented their wikis to the rest of their cohort and also submitted a short reflective evaluation of their personal experiences of using this particular technology. In addition, they completed a questionnaire about their wiki experiences, from which common themes were identified to help inform staff of any future assessment developments which might be needed.

Programme tutors from the School and the Faculty Inter-professional Learning Coordinator were informed of the project and gave their approval. Practice Educators were informed about the assessment by letter, and contact details of the Practice Team were disseminated to them should further information have been needed.

Assessment outcome

After the placement, groups had one final meeting in order to complete their work. The wikis were then shared in a 'Best Bits' showcase where all the students from the cohort met together to see all the final wikis. This was managed by a member of the Practice Team. All groups finished the set task on time, and so by the submission date there were 10 completed wikis.

Each group wiki was assessed by the Lead Tutor for this piece of work and feedback was provided in the following areas:
- Wiki layout and appearance
- Introduction
- Video analysis
- Placement reflections
- Referencing

Individual students were acknowledged where it was evident that they had contributed far more than other students within their groups. The wiki history enables contributions to be monitored, so this was easy to see. All groups passed the wiki assessment.

A post-assessment questionnaire was issued at the showcase event and completed by 58 students. It contained seven questions and was devised by the Practice Team member responsible for leading the assessment activity. Information was requested which highlighted the amount of time each student had spent editing the wiki during the placement and when in the day this work was undertaken. Information about the barriers which challenged students' contributions to the wiki, the positive learning experiences gained and suggestions for improving the learning experience was also requested.

Results of the questionnaire

Key findings of the questionnaire are reported below. Figures 1 and 2 show the level of engagement the students recorded in working on their wikis.

Figure 1. Weekly time commitment to wiki work.


Figure 2. Daily time allocation for wiki work.

Figures 3 and 4 show the barriers to wiki working and ways in which wiki engagement was felt to enhance learning, while Figures 5 to 7 summarise a variety of student opinions on the use of wikis in this context.

Figure 3. Perceived barriers which students reported as influencing their contribution to the wiki activity.


Figure 4. Student opinions on the educational benefits of wiki work.

Figure 5. Student concerns about using a wiki as a group working activity.


Figure 6. Students' positive opinions of using a wiki for group work activity.

Figure 7. Student suggestions for improvement of the wiki activity.


Assessment evaluation

The aim of the assessment was for students to gain knowledge and understanding of infection prevention. This knowledge was demonstrated by all wiki groups in the final 'showcase' at the University. Students were also expected to develop in ways which would enable them to compare theory and practice, reflect on their own practice and that of others, write collaboratively and improve their generic information technology (IT) skills. The tutor was able to give detailed written feedback to each group that evaluated not only the final product but also the skills mentioned and the contribution and timeliness of edits by each contributor.

The wiki training received by the staff was sufficient for them to be able to create and support this assessment activity, and only 2% of students reported that technical difficulties were a barrier to wiki working (Figure 3). The University infrastructure was also able to support the production of the videos which were analysed by the students within the wiki.

The results of the questionnaire showed that 57% of students spent between one and three hours per week editing their wiki while they were on placement (Figure 1). This was deemed by the academic staff to be an appropriate amount of time for this activity. It was noted, however, that 31% of students spent less than one hour per week on their wiki editing (Figure 1), but due to the anonymous nature of the questionnaire it was not possible to identify their wiki contributions in order to look at skill levels in these particular students.

It was encouraging to see that 38% of students did most of their work whilst on placement but after hours (Figure 2). Physiotherapy students' placement education should be patient-centred and time should not be spent on this type of assessment during the working day. It was concerning to see, however, that 26% of students did not engage at all with the wiki until their placement had finished (Figure 2), despite being asked to contribute throughout the five weeks of their placement. This may well explain the complaints that 28% of students made about inequality of workload within the groups (Figure 3).

The students reported that the main challenges to completing the assessment were reduced internet availability, and that the placement learning workload and experience were too consuming to leave much time or energy to engage with wiki development (Figure 3). It is acknowledged that students working hard within a busy department may well be tired at the end of the day and, if this is coupled with a long journey home, it is understandable that wiki engagement would be affected. However, the nature of the remote working afforded by the wiki does enable those students to work at times which suit them better, over weekends for example.
This is supported by 25% of the students suggesting that the best thing about using the wiki was that it enabled remote working (Figure 6).


When looking at what could be improved for the next cohort of students, it was interesting to see that 25% of the students felt that there should be more wiki training, and some of the comments suggested that there had been minimal training (Figure 7). However, the presence of a PowerPoint presentation and associated supporting literature, which was posted onto StudyNet after the training session, suggests that this was not the case. It is perhaps due to the timing of this preparatory session that these comments were made. Due to constraints within the academic timetable it was only possible to run the training session three months before the students went on placement. It is proposed that the length of time between training and starting the wiki was far too long and that many of the students had simply forgotten the information.

It was also interesting to see that, although this assessment focused on remote working, 30% of the students wanted more face-to-face group contact (Figure 7). Given that these were first year undergraduate students, this result may arise from a lack of confidence in the technology or a lack of experience of distance group working activity.

The strengths of this assessment were that tutors could use real clinical issues to aid learning and students explored and discussed them in relevant ways. It enabled reflective practice, and 58% of students felt they learned about infection control through the activity (Figure 4). New technology was introduced to the cohort and was used to meet the aims of the study. Boulos, Maramba and Wheeler (2006) suggest that, in the absence of direct contact with an academic institution and with reduced social presence from peers, wiki technology does enable learner support. This view is now supported by the academic staff involved in this assessment activity.

Recommendations for future use

Ideally, students could engage in a formative wiki exercise to increase their confidence in the technology, introduce them to the complexities of remote group working and allow them time to develop strategies to help them during the summative assessment. However, Cubric (2007) suggests that the nature of formative work is such that students may well choose not to engage, and therefore any benefit from this activity is lost. Boulos, Maramba and Wheeler (2006) suggest that re-educating students to participate within a distance learning environment may be necessary, and that support may be needed when they first try to communicate with each other using new collaborative technologies.

In order for students to retain more of the information about the assessment activity and process, the timing of the pre-placement lecture should be further considered and moved closer to the placement start date. This may help to reduce the perception among some students that there was a lack of information regarding the required work.

Conclusion


In summary, specific placement learning about infection prevention issues for first year undergraduate physiotherapy students was enabled through the use of a group wiki assignment. The functionality of the wiki enabled academic staff to evaluate students' individual and group effort in the assignment, and the wikis could subsequently be used as a source of information for other cohorts of students. Students were introduced to the concept of remote collaborative working and gained valuable experience in the use of the IT which supports this web-based activity. Future considerations include the introduction of a pre-placement information lecture nearer the commencement of the placement, enhanced student preparation regarding the nature of remote collaborative working, and a formative piece of work which supports the final summative assignment.

Acknowledgements

I would like to thank my colleague Scott Rickard for his encouragement and Jennefer Hodges for her constructive criticism during the preparation of this article.

References

Alexander, B. (2006). Web 2.0: A new wave of innovation for teaching and learning? EDUCAUSE Review, 41(2). Retrieved January 13th 2011 from http://www.educause.edu/ir/library/pdf/ERM0621.pdf

Boulos, M.N.K., Maramba, I., and Wheeler, S. (2006). Wikis, blogs and podcasts: a new generation of web-based tools for virtual collaborative clinical practice and education. BMC Medical Education, 6, 41. Accessed January 2011 from http://www.biomedcentral.com/1472-6920/6/41

Chartered Society of Physiotherapy (1997). Guidelines for good practice for the education of Clinical Educators. London: Chartered Society of Physiotherapy.

Chickering, A.W., and Gamson, Z.F. (1987). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 3-7.

Cubric, M. (2007). Wiki-based process framework for blended learning. Proceedings of WikiSym 2007: The 2007 International Symposium on Wikis, pp. 11-22. New York: Association for Computing Machinery.

Duffy, P., and Bruns, A. (2006). The use of blogs, wikis and RSS in education: a conversation of possibilities. Proceedings of the Online Learning and Teaching Conference 2006, Brisbane, September 26. Retrieved January 2011 from http://eprints.qut.edu.au/5398/1/5398.pdf

Gibbs, G. (1988). Learning by Doing: A guide to teaching and learning methods. Oxford: Oxford Further Education Unit, Oxford Polytechnic.

Higher Education Academy (2007). The UK Professional Standards Framework for teaching and supporting learning in higher education. Retrieved 12 January 2011 from http://www.heacademy.ac.uk/assets/York/documents/ourwork/rewardandrecog/ProfessionalStandardsFramework.pdf


Higher Education Funding Council for England (2005). HEFCE strategy for e-learning. London: HEFCE.

Honey, P., and Mumford, A. (1992). The Manual of Learning Styles. Maidenhead: Peter Honey Publications.

Johnson, R.T., and Johnson, D.W. (1986). Action research: cooperative learning in the science classroom. Science and Children, 24, 31-32.

Miers, J. (2004). BELTS or braces? Technology School of the Future. Retrieved November 2006 from http://www.tsof.edu.au/research/Reports04/miers.asp

National Institute for Clinical Excellence (2003). Guidelines for Infection Control. London: NICE.

Parker, K.R., and Chao, J.T. (2007). Wiki as a teaching tool. Interdisciplinary Journal of Knowledge and Learning Objects, 3, 57-72.

Wang, C., and Turner, D. (2004). Extending the wiki paradigm for use in the classroom. Proceedings of the International Conference on Information Technology (ITCC 2004), Las Vegas, Nevada, USA, April 5-7, 255-259. Retrieved January 2011 from http://www.pgce.soton.ac.uk/ict/NewPGCE/PDFs/Extendingthewikiparadigmforuseintheclassroom.pdf


Student Voice

Assessment and feedback: the student perspective

In the light of exercises such as the UK National Student Survey (http://www.thestudentsurvey.com/), much time and effort has been devoted in particular to the issues of assessment and feedback. However, are we listening to the student viewpoint as we endeavour to improve these critical aspects of education? To conclude this assessment-themed issue of Blended Learning in Practice, we ask two University of Hertfordshire students for their opinions on assessment and feedback.

Phil Porter (p.r.porter@herts.ac.uk) asks the questions, and the answers are not always what we might expect! Please click on any of the screenshots below to access the student videos.
