Confusion Matrix - Table of Confusion

In predictive analytics, a table of confusion (sometimes also called a confusion matrix) is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives. This allows more detailed analysis than the mere proportion of correct guesses (accuracy). Accuracy is not a reliable metric of a classifier's real performance, because it yields misleading results when the data set is unbalanced (that is, when the numbers of samples in the different classes vary greatly). For example, if there were 95 cats and only 5 dogs in the data set, the classifier could easily be biased into classifying all the samples as cats. The overall accuracy would be 95%, yet the classifier would have a 100% recognition rate for the cat class but a 0% recognition rate for the dog class.
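
To make this concrete, here is a minimal sketch in plain Python (no libraries needed); the always-"cat" classifier is the hypothetical biased one described above, and the code reproduces the 95%/100%/0% figures:

    # The 95-cat / 5-dog example: a degenerate classifier that
    # predicts "cat" for every sample.
    actual    = ["cat"] * 95 + ["dog"] * 5
    predicted = ["cat"] * 100

    # Overall accuracy looks excellent: 95%.
    accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
    print(f"accuracy: {accuracy:.0%}")

    # Per-class recognition rate tells the real story:
    # 100% for cats, 0% for dogs.
    for cls in ("cat", "dog"):
        hits = [a == p for a, p in zip(actual, predicted) if a == cls]
        print(f"recognition rate ({cls}): {sum(hits) / len(hits):.0%}")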

Assuming the confusion matrix above, the corresponding table of confusion for the cat class would be:

                   Actual cat           Actual non-cat
Predicted cat      5 true positives     2 false positives
Predicted non-cat  3 false negatives    17 true negatives
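
A per-class table like this can be derived mechanically from a multi-class confusion matrix. The sketch below shows one way to do so; the 3-class matrix is a hypothetical example (not the article's original matrix), chosen only so that its cat-class counts match the table above:

    # Collapse a multi-class confusion matrix (rows = actual class,
    # columns = predicted class) into a 2x2 table for one class.
    labels = ["cat", "dog", "rabbit"]
    matrix = [
        [5, 3, 0],   # actual cats
        [2, 3, 1],   # actual dogs
        [0, 2, 11],  # actual rabbits
    ]

    def table_of_confusion(matrix, i):
        """Return (TP, FN, FP, TN) for class index i."""
        total = sum(sum(row) for row in matrix)
        tp = matrix[i][i]                         # actual i, predicted i
        fn = sum(matrix[i]) - tp                  # actual i, predicted otherwise
        fp = sum(row[i] for row in matrix) - tp   # predicted i, actual otherwise
        tn = total - tp - fn - fp                 # everything else
        return tp, fn, fp, tn

    print(table_of_confusion(matrix, labels.index("cat")))  # (5, 3, 2, 17)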

The final, overall table of confusion would then contain the cell values averaged over all classes combined.
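
This averaging is a form of macro-averaging. A self-contained sketch, reusing the same hypothetical 3-class matrix as above:

    # Build the 2x2 table for each class, then average each cell.
    matrix = [[5, 3, 0], [2, 3, 1], [0, 2, 11]]
    total = sum(map(sum, matrix))

    tables = []
    for i in range(len(matrix)):
        tp = matrix[i][i]
        fn = sum(matrix[i]) - tp
        fp = sum(row[i] for row in matrix) - tp
        tables.append((tp, fn, fp, total - tp - fn - fp))

    # Average each of TP, FN, FP, TN over all classes.
    averages = [sum(t[k] for t in tables) / len(tables) for k in range(4)]
    print(dict(zip(("TP", "FN", "FP", "TN"), averages)))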
