Confusion Matrix - Table of Confusion

In predictive analytics, a table of confusion (sometimes also called a confusion matrix) is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives. This allows more detailed analysis than the mere proportion of correct guesses (accuracy). Accuracy is not a reliable metric of a classifier's real performance, because it yields misleading results when the data set is unbalanced (that is, when the numbers of samples in the different classes vary greatly). For example, if there were 95 cats and only 5 dogs in the data set, the classifier could easily be biased into classifying every sample as a cat. The overall accuracy would be 95%, yet the classifier would have a 100% recognition rate for the cat class and a 0% recognition rate for the dog class.
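
A minimal Python sketch of that example (the label lists are hypothetical, constructed only to match the 95-cat/5-dog scenario described above):

    # Hypothetical unbalanced data set: 95 cats, 5 dogs.
    actual = ["cat"] * 95 + ["dog"] * 5
    # A degenerate classifier that predicts "cat" for every sample.
    predicted = ["cat"] * 100

    # Overall accuracy: the proportion of correct guesses.
    accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
    print(f"accuracy: {accuracy:.0%}")  # 95%

    # Per-class recognition rate: accuracy restricted to each class.
    for cls in ("cat", "dog"):
        in_class = [a == p for a, p in zip(actual, predicted) if a == cls]
        print(f"{cls}: {sum(in_class) / len(in_class):.0%}")  # cat: 100%, dog: 0%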

Assuming the confusion matrix above, its corresponding table of confusion for the cat class would be:

5 true positives (actual cats correctly classified as cats)
3 false negatives (cats incorrectly classified as another class)
2 false positives (samples from other classes incorrectly classified as cats)
17 true negatives (all remaining samples correctly classified as non-cats)

The final table of confusion would contain these values averaged over all classes.
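
Both steps can be computed mechanically from a multiclass confusion matrix. The Python sketch below is a minimal illustration: the 3x3 matrix and the third class name ("rabbit") are hypothetical, chosen only to be consistent with the cat-class counts listed above, since the original matrix is not reproduced in this section; reading the final table as an element-wise macro-average is likewise an assumption.

    # Hypothetical 3-class confusion matrix (rows: actual class, columns:
    # predicted class), consistent with the cat-class counts listed above.
    classes = ["cat", "dog", "rabbit"]
    matrix = [
        [5, 3, 0],   # actual cat
        [2, 3, 1],   # actual dog
        [0, 2, 11],  # actual rabbit
    ]

    def table_of_confusion(matrix, k):
        """Collapse a multiclass confusion matrix into (TP, FN, FP, TN) for class k."""
        n = len(matrix)
        tp = matrix[k][k]                                    # class k, predicted as k
        fn = sum(matrix[k][j] for j in range(n) if j != k)   # class k, predicted as another class
        fp = sum(matrix[i][k] for i in range(n) if i != k)   # other classes, predicted as k
        tn = sum(matrix[i][j]                                # other classes, predicted as not-k
                 for i in range(n) for j in range(n)
                 if i != k and j != k)
        return tp, fn, fp, tn

    print(table_of_confusion(matrix, classes.index("cat")))  # (5, 3, 2, 17)

    # Average the per-class tables element-wise to form the final table.
    tables = [table_of_confusion(matrix, k) for k in range(len(classes))]
    average = tuple(sum(values) / len(tables) for values in zip(*tables))
    print(average)  # averaged (TP, FN, FP, TN) across cat, dog, rabbit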
