Kappa Statistics
- Main articles: Cohen's kappa, Fleiss' kappa
Cohen's kappa, which works for two raters, and Fleiss' kappa, an adaptation that works for any fixed number of raters, improve upon the joint probability of agreement in that they take into account the amount of agreement that could be expected to occur by chance. They suffer from the same limitation as the joint probability, however, in that they treat the data as nominal and assume the ratings have no natural ordering; if the data do have an order, the information in the measurements is not fully exploited.
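To make the chance correction concrete, here is a minimal sketch of Cohen's kappa for two raters, written from the standard definition κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies. The function name and the example labels are illustrative, not from the original text:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over nominal categories."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: for each category, the product of the two raters'
    # marginal probabilities of choosing it, summed over categories.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings: the raters agree on 3 of 4 items (p_o = 0.75),
# while their marginals give p_e = 0.5, so kappa = 0.5.
kappa = cohens_kappa(["yes", "yes", "no", "no"],
                     ["yes", "no", "no", "no"])
```

Note that κ is undefined when p_e = 1 (both raters always use a single category); a production implementation would need to handle that edge case explicitly.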
Read more about this topic: Inter-rater Reliability