ICCS 2009 Technical Report - IEA
Paneling

Paneling is a team-based approach to reviewing assessment materials. This rigorous quality-control mechanism is employed during the development of assessment materials. Paneling is a process that recognizes the importance of exposing material to multiple viewpoints. During this process, a small group (between three and six) of test developers jointly review material that one or more of them has developed. The review leads to acceptance, modification, or rejection. Panel participants compare their answers to the questions and raise issues about the questions and the material. Discussion is robust because of the need to ensure that the selected items perform as intended.

The following questions provide a summary of the issues that formed the focus of the evaluation of the item material developed for ICCS. The relevance of each evaluation issue varied according to the individual characteristics of the material under consideration.

Content validity
• How did the material relate to the ICCS test specifications?
• Did the questions test the content and cognitive processes described in the assessment framework?
• Did the questions relate to the essence of the stimulus or did they focus on trivial side issues?
• How would this material stand up to public scrutiny (including staff involved in the project as well as members of the wider community)?

Clarity and context
• Was the material coherent, unambiguous, and clear?
• Was the material interesting, worthwhile, and relevant?
• Did the material assume prior knowledge and, if so, was this assumption acceptable or part of what the test intended to measure?
• Was the reading load as low as possible?
• Were there idioms or syntactical structures likely to prove difficult to translate into other languages?

Format
• Was the proposed format the most suitable for the content and process being assessed by the item?
• Was the key (the correct answer to a multiple-choice question) indisputably correct?
• Were the distractors (the incorrect options to a multiple-choice question) plausible but also irrefutably incorrect?

Test-takers
• Did the test-item material match the expected range of ability levels, age, and maturity of the ICCS target population?
• Did the material appear to be cross-culturally relevant and sensitive?
• Were items likely to be easier or harder for certain subgroups in the target population for reasons other than differences in the ability measured by the test?
• Did the constructed-response items provide clear guidance as to the expected answers to the test question?