ICCS 2009 Technical Report - IEA
Scaling Procedures for ICCS Test Items

Figure 11.4 shows the item-by-country interaction graph for item CI2HRM2, which was not retained for international scaling. The figure shows clear and considerable variation in the item difficulties across countries. Similar graphs produced for each test item were used in the test-item adjudication process at the international and national levels, while information about the occurrence of cross-national DIF was used to identify items for post-verification checks after completion of the main data collection.

Figure 11.4: Example of item-by-country interaction graph for Item CI2HRM2
[Figure omitted: national item difficulty plotted on a vertical axis labelled "Probability" (scale from -6.0 to 6.0) for each participating country: BGR, CHL, TWN, COL, CYP, DNK, DOM, EST, FIN, GRC, GTM, HKG, IDN, IRL, ITA, AUT, KOR, LVA, LIE, LTU, LUX, MLT, MEX, NLD, NZL, BFL, NOR, PRY, POL, RUS, SVK, SVN, ESP, SWE, CHE, THA, ENG]

Although the ICCS test items generally showed only limited item-by-country interactions, some national item difficulties deviated quite considerably from the international item difficulty. In such cases, the affected items were omitted from scaling for the national samples where the larger deviations were observed.

Item-by-country interaction was also examined for the open-ended items. For these items, item-by-country interaction can be evidence of differences in the relative harshness of markers across countries. A comparison of the relative difficulties of open-ended and multiple-choice items across all countries showed that students in the Dominican Republic and Indonesia appeared to find it easier to answer the open-ended items correctly than did students in the other countries. This pattern suggested problems with how the scoring procedures had been conducted, and all open-ended items for these two countries were subsequently removed from scaling and from the international database.

Missing data issues

There were three possible types of missing responses in the ICCS test: omitted items (coded as 9), not-administered items (coded as 8), and invalid responses (coded as 7). The omitted response category was used when a student provided no response at all to an item administered to him or her. Not-administered items were those that, although part of the whole item pool, were not in the booklet administered to a student, either deliberately (when there were alternative or rotated test booklets) or, in rare cases, in error. Invalid responses occurred when, for example, a student ticked more than one of the possible answers to a multiple-choice item.
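To make the coding scheme concrete, the sketch below shows one way such response codes might be handled when preparing data for scaling. The function name and the recoding rules (treating omitted and invalid responses as incorrect, and not-administered items as missing) are illustrative assumptions for this sketch, not the procedures documented in this report.

```python
# Illustrative sketch only: the codes 9 (omitted), 8 (not administered),
# and 7 (invalid) follow the text above, but the recoding rules applied
# here are assumptions for demonstration, not ICCS's documented procedure.

OMITTED = 9
NOT_ADMINISTERED = 8
INVALID = 7

def recode_for_scaling(response):
    """Map a raw item response code to a value usable by a scaling model.

    Returns an integer score, or None where the item should be treated
    as missing (contributing no information to the model).
    """
    if response == NOT_ADMINISTERED:
        return None          # item was not in the student's booklet
    if response in (OMITTED, INVALID):
        return 0             # assumption: scored as incorrect
    return response          # a valid score (e.g., 0/1 for dichotomous items)

# Example: one student's raw responses to five items
raw = [1, 0, 9, 8, 7]
print([recode_for_scaling(r) for r in raw])   # -> [1, 0, 0, None, 0]
```

The distinction between returning 0 and returning None matters for scaling: a response scored 0 counts against the student's ability estimate, whereas a missing (not-administered) item is simply excluded.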
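Similarly, the item-by-country interaction check described earlier in this section can be sketched in code. A minimal approach, assuming dichotomous items, is to approximate each item's difficulty per country as the logit of the proportion of incorrect responses, compare these against the cross-country mean, and flag items whose national difficulty deviates beyond some threshold. Both the logit shortcut and the 0.5-logit threshold are illustrative assumptions; the operational analyses would rest on IRT item parameter estimates rather than this simplification.

```python
# Illustrative sketch only: flags items whose national difficulty deviates
# markedly from the international (cross-country mean) difficulty. Real
# analyses would use IRT item parameter estimates; here difficulty is
# approximated by the logit of the proportion incorrect, and the 0.5-logit
# threshold is an arbitrary illustration.
import math

def logit_difficulty(responses):
    """Approximate item difficulty from scored 0/1 responses."""
    p_correct = sum(responses) / len(responses)
    p_correct = min(max(p_correct, 0.01), 0.99)  # guard against p = 0 or 1
    return math.log((1 - p_correct) / p_correct)

def flag_dif(data, threshold=0.5):
    """data: {country: {item: [0/1 responses]}}.

    Returns a list of (item, country, deviation) for national difficulties
    deviating from the cross-country mean by more than the threshold.
    """
    items = {item for by_item in data.values() for item in by_item}
    flags = []
    for item in sorted(items):
        national = {c: logit_difficulty(r[item])
                    for c, r in data.items() if item in r}
        international = sum(national.values()) / len(national)
        for country, delta in national.items():
            deviation = delta - international
            if abs(deviation) > threshold:
                flags.append((item, country, round(deviation, 2)))
    return flags
```

Clamping the proportion correct avoids infinite logits for items that every sampled student in a country answered correctly or incorrectly; large flagged deviations would correspond to the outlying points visible in graphs such as Figure 11.4.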
