Inter-rater Reliability - Kappa Statistics

Kappa Statistics

Main articles: Cohen's kappa, Fleiss' kappa

Cohen's kappa, which works for two raters, and Fleiss' kappa, an adaptation that works for any fixed number of raters, improve upon the joint probability of agreement in that they take into account the amount of agreement that could be expected to occur by chance. They suffer from the same problem as the joint probability, however, in that they treat the data as nominal and assume the ratings have no natural ordering; if the data do have an order, the information in the measurements is not fully exploited.
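As a minimal sketch of the two-rater case, Cohen's kappa can be computed from the observed agreement p_o and the chance agreement p_e (the product of each rater's marginal label frequencies, summed over labels). The function below assumes simple nominal labels and ignores the degenerate case p_e = 1:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items (nominal data)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: for each label, the product of the two raters'
    # marginal frequencies, summed over all labels.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    # Kappa rescales observed agreement relative to chance:
    # 1 = perfect agreement, 0 = no better than chance.
    return (p_o - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # 0.333: modest agreement beyond chance
```

Note that the raw observed agreement here is 4/6 ≈ 0.67, but because both raters use each label half the time, half of that agreement is expected by chance, leaving a much smaller kappa.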
