

We are conducting a rigorous examination of the extent to which CIs “measure up” to the claims made about them as assessment tools, and of the ways they can be useful to engineering educators. The project’s research program is driven by a validity analysis framework (outlined below) focused on three broad research questions. What does a given CI really measure relative to its conceptual foundations and the intended measurement constructs? Do the questions and multiple-choice distractors in CIs support diagnostic methods that reliably identify desired conceptual understandings as well as robust misconceptions (e.g., “heat and temperature are equivalent”)? How should instructors think about incorporating CIs into their instructional practice, and does adding diagnostic measurement and reporting lead to formative use of CIs in classrooms and to improved student learning?

Theoretical Framework

The STEM-area CIs under study provide good examples of using conceptual models of understanding in instructional domains to systematically generate sets of test questions. The resulting student performance data have the potential for rigorous interpretation relative to a cognitive model of domain understanding. When such an assessment is designed and then validated, the information it generates about student understanding and performance should be usable for changing the conditions of instruction and the nature of student learning outcomes. Successful change, however, depends on (1) developing methods for extracting relevant diagnostic information in a timely and rigorous manner, and (2) validating the instrument relative to its intended purposes and uses.
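
To make point (1) concrete, the following minimal Python sketch shows one way diagnostic information might be extracted from CI responses: each (item, chosen distractor) pair is mapped to a misconception label, and repeated hits across items are tallied as evidence. The item numbers, option letters, and misconception labels are hypothetical illustrations, not drawn from any published inventory.

from collections import Counter

# Hypothetical mapping from (item, chosen distractor) to a misconception
# label; in practice this would come from the CI developers' documented
# links between distractors and misconceptions.
DISTRACTOR_MAP = {
    (1, "B"): "heat_equals_temperature",
    (1, "C"): "heat_is_a_substance",
    (2, "A"): "heat_equals_temperature",
    (2, "D"): "rate_confused_with_amount",
}

def diagnose(responses):
    """Tally the misconceptions suggested by a student's distractor choices.

    responses: dict mapping item number -> chosen option letter.
    Repeated hits on the same label across items are stronger evidence
    of a robust misconception than a single slip.
    """
    hits = Counter()
    for item, choice in responses.items():
        label = DISTRACTOR_MAP.get((item, choice))
        if label is not None:
            hits[label] += 1
    return hits

# Example: a student who picks the "heat = temperature" distractor twice.
student = {1: "B", 2: "A"}
print(diagnose(student))  # Counter({'heat_equals_temperature': 2})

Tallies of this kind, aggregated per student or per class, are the sort of timely, rigorous diagnostic reporting that point (1) calls for.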

Our framing of the validity analysis challenge with CIs is based on the “assessment as reasoning from evidence” framework articulated in the National Research Council report Knowing What Students Know: The Science and Design of Educational Assessment (Pellegrino, Chudowsky & Glaser, 2001). As argued therein, a high-quality assessment fully attends to all three components of the “assessment triangle” and their interconnections: cognition, observation, and interpretation. Work to date in developing STEM concept inventories represents a powerful explication of only two of the three critical components of the assessment triangle: cognition and observation. Failure to advance the interpretation component of these assessments in a detailed and equally powerful manner severely restricts the nature of the inferences that can be derived from CI data, and constrains how the interpretation of student performance might be used to guide instruction and improve student learning.

CIs tend to be highly focused on a small set of key constructs and understandings within a limited academic content domain. Unlike typical assessments of student academic achievement, CI development is grounded in various forms of empirical evidence, theoretical interpretation, and instructor judgment/intuition about student understanding in domains like Statics or Thermodynamics. Developers leverage these foundations to conceptualize and generate the question situations to present to students and to develop plausible multiple-choice distractors linked to misconceptions. Thus, CIs elaborate the cognitive model regarding the nature of student understanding and its implications for the
