The Philosophy of Inter-rater Agreement

Examination Boards use several operational definitions of "inter-rater reliability", reflecting different views of what constitutes reliable agreement between raters.

There are three operational definitions of agreement (contrasted in the sketch after these lists):

1. Reliable raters agree with the "official" rating of a performance.

2. Reliable raters agree with each other about the exact ratings to be awarded.

3. Reliable raters agree about which performance is better and which is worse.

These combine with two operational definitions of behavior:

A. Reliable raters are automatons, behaving like "rating machines". This category includes the rating of essays by computer. This behavior can be evaluated by Generalizability theory.

B. Reliable raters behave like independent witnesses. They demonstrate their independence by disagreeing slightly. This behavior can be evaluated by the Rasch model.
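
To make the distinction between the agreement definitions concrete, here is a minimal plain-Python sketch. The ratings, rater names, and scale are hypothetical; it simply contrasts definitions 1 and 2 (exact matching of scores) with definition 3 (agreement about which performance is better).

```python
# Hypothetical ratings of four performances on a 1-5 scale.
official = [4, 2, 5, 3]   # the "official" ratings (definition 1 compares against these)
rater_b = [3, 1, 4, 2]    # never matches the official score, but orders performances identically

def exact_match_rate(x, y):
    """Proportion of performances given identical ratings (definitions 1 and 2)."""
    return sum(a == b for a, b in zip(x, y)) / len(x)

def concordance_rate(x, y):
    """Proportion of performance pairs ordered the same way by both raters (definition 3)."""
    pairs = [(i, j) for i in range(len(x)) for j in range(i + 1, len(x))]
    same_order = sum((x[i] - x[j]) * (y[i] - y[j]) > 0 for i, j in pairs)
    return same_order / len(pairs)

print(exact_match_rate(official, rater_b))  # 0.0 -> "unreliable" under definitions 1 and 2
print(concordance_rate(official, rater_b))  # 1.0 -> perfectly "reliable" under definition 3
```

The point of the example: the same pair of raters can look completely unreliable under one definition and perfectly reliable under another, which is why the choice of operational definition matters.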
