
Qualitative Data Analysis I: Text Analysis 513

just by chance. To adjust for this possibility, many researchers use a statistic called Cohen's kappa (Cohen 1960), or k.

Cohen's kappa

Kappa is a statistic that measures how much better than chance is the agreement between a pair of coders on the presence or absence of binary (yes/no) themes in texts. Here is the formula for kappa:

k = (Observed − Chance) / (1 − Chance)    Formula 17.1

When k is 1.0, there is perfect agreement between coders. When k is zero, agreement is what might be expected by chance. When k is negative, the observed level of agreement is less than what you'd expect by chance. And when k is positive, the observed level of agreement is greater than what you'd expect by chance. Table 17.7 shows the data in table 17.6 rearranged so that we can calculate kappa.

TABLE 17.7
The Coder-by-Coder Agreement Matrix for the Data in Table 17.6

                         Coder 2
    Coder 1          Yes      No      Coder 1 totals
      Yes            1 (a)    1 (b)         2
      No             3 (c)    5 (d)         8
    Coder 2 totals   4        6            10 (n)

The observed agreement between Coder 1 and Coder 2 is:

(a + d) / n

Here, Coder 1 and Coder 2 agreed that the theme was present in the text once (cell a) and they agreed that the theme was absent five times (cell d), for a total of 6, or 60% of the 10 texts.

The probability that Coder 1 and Coder 2 agree by chance is:

((a + b) / n) × ((a + c) / n) + ((c + d) / n) × ((b + d) / n)

Here, the probability that Coder 1 and Coder 2 agreed by chance is .08 + .48 = .56. Using formula 17.1, we calculate kappa:
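The arithmetic above is easy to script. A minimal Python sketch of formula 17.1, using the four cell counts from table 17.7 (the function name and argument labels a, b, c, d are mine, matching the cell labels in the table):

```python
# Minimal sketch: Cohen's kappa for two coders on one binary (yes/no) theme.
# Arguments a, b, c, d are the four cell counts as labeled in table 17.7.
def cohens_kappa(a, b, c, d):
    """Formula 17.1: k = (Observed - Chance) / (1 - Chance)."""
    n = a + b + c + d
    observed = (a + d) / n  # proportion of texts on which the coders agree
    # agreement expected by chance, from the marginal proportions
    chance = ((a + b) / n) * ((a + c) / n) + ((c + d) / n) * ((b + d) / n)
    return (observed - chance) / (1 - chance)

# Data from table 17.7: observed = .60, chance = .08 + .48 = .56
k = cohens_kappa(a=1, b=1, c=3, d=5)
print(round(k, 3))  # (.60 - .56) / (1 - .56), about .09
```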
