Kappa Statistics

Main articles: Cohen's kappa, Fleiss' kappa

Cohen's kappa, which works for two raters, and Fleiss' kappa, an adaptation that works for any fixed number of raters, improve upon the joint probability of agreement in that they take into account the amount of agreement that could be expected to occur through chance. They suffer from the same problem as the joint probability, however, in that both treat the data as nominal and assume the ratings have no natural ordering. If the data do have an order, as with ordinal rating scales, these statistics do not take full advantage of the information in the measurements.
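
Both statistics have the form κ = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance. As a rough illustration, the Python sketch below computes Cohen's kappa for two raters; the function name and sample data are invented for the example, and a real analysis would more likely use an established routine such as scikit-learn's cohen_kappa_score.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        # Chance-corrected agreement between two raters over nominal labels.
        n = len(rater_a)
        # p_o: observed agreement, the fraction of items labeled identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # p_e: agreement expected by chance, computed from each rater's
        # marginal label frequencies.
        freq_a = Counter(rater_a)
        freq_b = Counter(rater_b)
        p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical example: two raters classify eight items as "yes"/"no".
    a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]
    b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
    print(cohens_kappa(a, b))  # 0.5: moderate agreement beyond chance

Note that the labels are compared only for equality, so relabeling them with ordered categories such as "low"/"high" would yield the same κ. This is exactly the limitation described above; weighted variants of kappa exist to account for ordinal data.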
