Table 4.5. Evaluation of Kappa Value.

    Evaluation      Kappa value
    Not Good        .00 – .40
    Moderate        .41 – .60
    Substantial     .61 – .80
    Near perfect    .81 – 1.00

P(E) = \sum_{j=1}^{m} p_j^2    (4.2)

The extent of agreement among the raters regarding the ith article is the proportion of pairs of assignments that agree out of all possible pairs. For the ith article, this is computed by equation 4.3.

S_i = \frac{\sum_{j=1}^{m} \binom{n_{ij}}{2}}{\binom{k}{2}}    (4.3)

To obtain the total proportion of agreement, I average these proportions across all rated articles using equation 4.4.

P(A) = \frac{1}{n} \sum_{i=1}^{n} S_i    (4.4)

Table 4.5 summarizes the criteria used to evaluate kappa values.

It was impossible to obtain a kappa value for each domain because some description types had low frequencies, as shown in Table 4.3. Instead, I calculated the tagging agreement by aggregating the six domains under the same summarization and found moderate agreement for the Definition and Process types, and substantial agreement for the Order of time type. There was also some agreement on the Instance type in the Gardening and Social domains, and on the Comparison type in the Healthcare and Political domains. For the other combinations, no agreement was found, or evaluation was impossible because of low frequencies.
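As a concrete illustration of equations 4.2–4.4, the following is a minimal sketch of the agreement computation, assuming (as in the standard Fleiss formulation) that n_ij is the number of raters who assigned the ith article to the jth description type and that every article is rated by the same number of raters k; the final kappa is combined as \kappa = (P(A) - P(E)) / (1 - P(E)), which is assumed here rather than restated in this section. The matrix and example counts are hypothetical.

```python
import numpy as np

def rating_agreement(counts):
    """Compute P(A), P(E), and kappa from a rating matrix.

    counts[i, j] is the number of raters who assigned article i to
    description type j; every article is assumed to be rated by the
    same number of raters k. (Illustrative sketch; names follow
    equations 4.2-4.4.)
    """
    counts = np.asarray(counts, dtype=float)
    n, m = counts.shape
    k = counts[0].sum()                      # raters per article

    # Eq. 4.2: chance agreement, with p_j the overall proportion of
    # assignments falling into type j.
    p_j = counts.sum(axis=0) / (n * k)
    P_E = float(np.sum(p_j ** 2))

    # Eq. 4.3: per-article agreement S_i = sum_j C(n_ij, 2) / C(k, 2).
    S_i = (counts * (counts - 1)).sum(axis=1) / (k * (k - 1))

    # Eq. 4.4: observed agreement is the mean of S_i over all articles.
    P_A = float(S_i.mean())

    # Standard Fleiss-style combination (assumed, not shown above).
    kappa = (P_A - P_E) / (1 - P_E)
    return P_A, P_E, kappa

# Hypothetical example: 4 articles, 3 raters, 2 description types.
print(rating_agreement([[3, 0], [2, 1], [0, 3], [1, 2]]))
```

On the hypothetical counts above this yields P(A) ≈ 0.67, P(E) = 0.50, and κ ≈ 0.33, which Table 4.5 would rate as Not Good.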
