COMMENTARY

OSCE: The subjective experience of an objective exam

Alim Nagji, BHSc
Medical Student (2012), Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Canada
Correspondence to Alim Nagji, Email: anagji@ualberta.ca

Abstract

The Objective Structured Clinical Examination (OSCE) is the primary modality for testing clinical skills throughout medical school and in residency training. This article explores the difficulties of the exam via the subjective perspective of a student in the system, commenting on reticent standardized patients, the lack of consensus on what makes an ideal medical student and the absence of feedback. As the exam celebrates nearly four decades in use, it is important that we continue to evaluate its usefulness and brainstorm innovative approaches to advancing the state of clinical examinations.

The Objective Structured Clinical Examination (OSCE) has rapidly become the leading clinical examination in North American medical schools. The unassailable champion of repetitive and reproducible evaluation, it has become the envy of all other tests. While multiple choice still holds a prominent position in most medical curricula, that is usually for the convenience of administering the test and the sheer volume of information one can sift through. As a learner, one has had ample time to exploit the system, mastering the nuances of the "all of the following EXCEPT" and "which answer is the BEST" questions. In reality, we have been asked this type of question since kindergarten, and the PBS specials we grew up on sang "one of these things is not like the other one."1

The OSCE is succinctly explained in the abstract of a 1975 BMJ article: "the examination is more objective and a marking strategy can be decided in advance."2 The veracity of the latter half of this claim is evidenced by the rubric-style approach of many modern OSCEs. However, the first part is where the interesting dilemma lies. Is the examination more objective? From the subjective experience of students, it would hardly seem so.

The variability lies in the innate qualities of both the "standardized patient" and the examiner. Having worked as a standardized patient, the instructions one often receives are to be as guarded as possible about information, refraining from volunteering details unless specifically probed. This is in direct contrast to the guidance offered in basic history-taking skills, where students are counselled to allow the patient to convey the narrative of their illness uninterrupted. For those who have participated in OSCEs, it is easy to recall those actors from whom information had to be stolen as if it were precious gems. From many a station I have walked out and, in conversation with my peers in different tracks, realized that another "standardized" patient had been much more forthcoming with a pivotal piece of the diagnostic puzzle.

In Rowntree's 17 proposals for better assessment, he notes that "there is an assumption rampant in talk of academic standards, that all qualified assessors feel, understand and judge in much the same way when confronted with the work of a particular student. It is presumed that they would notice and value the same skills and qualities and would broadly agree in their assessments. Abundant evidence attests to the falsity of such assumptions."3 In the same way, examiners vary widely in what they believe makes the ideal learner. One need only glance at the complicated medical admission system or the behemoth that is the Canadian Resident Matching Service (CaRMS) to realize that we cannot agree on the perfect model student, yet we continue to cling to antiquated standards so as to maintain a united front. Broad accusations of poor inter-rater reliability across a variety of domains notwithstanding,4 the OSCE remains a mainstay of evaluation despite its artificial construction and potentially variable environment.

In the same seminal article, the authors proclaim that the "examination results in improved feed-back to students and staff."2 This may hold true for teaching OSCEs, where two minutes of personalized commentary follow each station, but for the majority of exams in medicine the results are protected and not released. So while one may occasionally receive a grade or score sheet, one is left waiting for the commentary that could enhance clinical skills or refine an approach. The majority of instructors emphasize the need to train physicians, not test takers, yet the very nature of receiving a pass or a fail undermines the learning process. Research has shown that detailed, descriptive feedback is most effective when given alone, unaccompanied by grades or praise, the direct opposite of what students usually receive.5 The OSCE has significant advantages over multiple-choice questions, providing a rich opportunity for students to simulate patient encounters while maintaining some degree of standardization. However, its limitations and shortcomings should be discussed rather than disputed. The modern OSCE, nearly 40 years after its rise to prominence, seems stagnant in the face of rapid change in the medical community. Perhaps as we enter a new decade of medical education we can critique our instruments, as well as our students, and develop innovative models to evaluate competence in clinical skills.

References

1. Cooney JG (creator). Sesame Street [Television Series]. New York: PBS; 1969.
2. Harden RMcG, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examinations. British Medical Journal 1975;1:447.
3. Rowntree D. Assessing students: how shall we know them? London: Kogan Page; 1977. As cited in: Harden R, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education 1979;13(1):39-54.
4. Thistlethwaite JE. Developing an OSCE station to assess the ability of medical students to share information and decisions with patients: issues relating to interrater reliability and the use of simulated patients. Educ Health (Abingdon) 2002;15(2):170-9.
5. Lipnevich AA, Smith JK. Response to assessment feedback: The effects of grades, praise and sources of information. ProQuest Digital Dissertations 2008;3319438.

University of Alberta Health Sciences Journal • April 2012 • Volume 7 • Issue 1
