
The Kolb Learning Style Inventory—Version 3.1 2005 - Whitewater ...

simulation intended to measure clinical management skills: the CBX consists of eight patient management simulations, each involving a patient with an unknown surgical problem. The simulation allows the student to obtain results of the history and physical examination, to order laboratory studies, to request radiology procedures, and to perform invasive/interventional procedures or surgeries. Beyond the presenting complaint, management is unprompted, and the student must balance the clinical evaluation with the acuity and progression of the clinical problem. Time advances during the simulation in proportion to the time necessary to perform each examination, laboratory study, or intervention (1998: 63). Of the 227 participants in the study, 102 (45%) were Converging learners, 59 (26%) Assimilating, 48 (21%) Accommodating, and 18 (8%) Diverging. The results indicated that Converging and Assimilating learners scored significantly higher on the two multiple-choice performance measures, while no learning style difference was found on the CBX computer simulation. The authors concluded that the results support the Kolb (1984) and Newland and Woelfl (1992) assertions that Converging and Assimilating learners may have a performance advantage on objective, single-best-answer, multiple-choice examinations. They also concluded that the absence of a relationship between learning style and CBX simulation performance suggests that multiple-choice examinations and clinical case simulations measure different capabilities and achievements. Clinical management may require not only an abstract orientation supporting the acquisition, organization, and synthesis of preclinical basic science data, but also a concrete orientation involving pattern recognition and instinct. The data demonstrate the importance of evaluating learning outcomes by applying more than one type of examination format. Multiple-choice examinations favor abstract learners; however, clinical performance requires additional cognitive skills, abilities, and behaviors that are not adequately reflected in objective measures of performance.

Oughton and Reed (2000) measured the relationship between graduate students' learning styles and performance outcomes in a hypermedia environment in which students were required to structurally map out their acquired knowledge and grasp the interrelationships among various ideas and concepts. The dependent measures included the number of concepts, number of nodes, number of links, number of bidirectional links, number of multiple concept nodes, number of nodes with multiple links, levels of depth, preserved concepts, omitted concepts, and added concepts on each student's map. The results showed that Assimilating and Diverging learners were the most productive on their concept maps. The authors concluded that this result can be attributed to the common traits shared by the two learning styles: the ability to see many perspectives and the ability to generate many ideas.

Holley and Jenkins (1993) examined the impact of learning style on four different accounting exam question formats: multiple-choice theory (MCT), multiple-choice quantitative (MCQ), open-ended theory (OET), and open-ended quantitative (OEQ). Their results indicated a significant performance difference by learning style for all but the multiple-choice quantitative format. On the active-reflective learning style continuum, there was a significant difference in students' performance on the multiple-choice theory format (p < .01) and the open-ended quantitative format (p < .05), with active students performing better. On the abstract-concrete learning style continuum, abstract students performed better on the open-ended theory format (p < .062). The authors concluded that students with different learning styles perform differently depending on the examination format, and that performance cannot be generalized for similar subjects if the testing format varies.

This research suggests that educators need to exercise caution in evaluating performance based on a single outcome measure. Diverse assessment strategies are required to adequately measure students' overall competence and performance.

Experiential Learning in Teams

Current research, involving different methodologies and different educational and workplace populations, has shown that ELT is useful for understanding team learning and performance (Adams, Kayes, and Kolb 2005a). A number of studies support the proposition that a team is more effective if it learns from experience and emphasizes all four learning modes. Summarized below are studies of team member learning style, team roles, and team norms.

