In predictive analytics, a table of confusion (sometimes also called a confusion matrix) is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives. This allows more detailed analysis than the mere proportion of correct classifications (accuracy). Accuracy is not a reliable metric for the real performance of a classifier, because it yields misleading results when the data set is unbalanced (that is, when the number of samples in different classes varies greatly). For example, if there were 95 cats and only 5 dogs in the data set, the classifier could easily be biased into classifying all the samples as cats. The overall accuracy would be 95%, yet the classifier would have a 100% recognition rate for the cat class but a 0% recognition rate for the dog class.
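This pitfall is easy to reproduce. The following is a minimal sketch in plain Python using synthetic labels that match the example above:

```python
# A degenerate classifier that labels every sample "cat" scores 95%
# accuracy on an unbalanced data set while recognizing 0% of the dogs.
# The labels here are synthetic, matching the 95-cat / 5-dog example.
y_true = ["cat"] * 95 + ["dog"] * 5
y_pred = ["cat"] * 100  # the classifier always answers "cat"

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recognition_rate(cls):
    """Per-class recall: correctly labeled samples / actual samples of cls."""
    preds_for_cls = [p for t, p in zip(y_true, y_pred) if t == cls]
    return preds_for_cls.count(cls) / len(preds_for_cls)

print(f"accuracy:   {accuracy:.0%}")                    # 95%
print(f"cat recall: {recognition_rate('cat'):.0%}")     # 100%
print(f"dog recall: {recognition_rate('dog'):.0%}")     # 0%
```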
Assuming the confusion matrix above, the corresponding table of confusion for the cat class would be:
                      Actual cat            Actual non-cat
Predicted cat         5 true positives      2 false positives
Predicted non-cat     3 false negatives     17 true negatives
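These four counts follow mechanically from a multiclass confusion matrix by a one-vs-rest reduction: row and column sums for the class of interest give the false negatives and false positives, and everything else is a true negative. As a sketch, the 3×3 matrix below is an assumed example, constructed only so that the cat class reproduces the counts above:

```python
# One-vs-rest reduction: M[i][j] counts samples of actual class i that were
# predicted as class j. The matrix below is an ASSUMED example chosen to
# reproduce the cat counts above (5 TP, 3 FN, 2 FP, 17 TN); the other two
# classes are hypothetical.
M = [
    [5, 3, 0],   # actual cat
    [2, 3, 1],   # actual class B (assumed)
    [0, 2, 11],  # actual class C (assumed)
]

def table_of_confusion(M, k):
    total = sum(sum(row) for row in M)
    tp = M[k][k]
    fn = sum(M[k]) - tp                 # actual k, predicted as something else
    fp = sum(row[k] for row in M) - tp  # predicted k, actually something else
    tn = total - tp - fn - fp           # everything not involving class k
    return tp, fn, fp, tn

print(table_of_confusion(M, 0))  # (5, 3, 2, 17) for the cat class
```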
The final table of confusion would contain these values averaged over all classes.
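If "averaged" is read as a macro average of the per-class counts, a short continuation of the sketch above might look like this (the non-cat classes remain assumed):

```python
# Macro average over the per-class tables of confusion: each of TP, FN, FP,
# and TN is averaged across the classes. The tuples below are the outputs of
# table_of_confusion applied to every class of the assumed matrix M above.
per_class = {
    "cat": (5, 3, 2, 17),
    "B":   (3, 3, 5, 16),   # assumed class
    "C":   (11, 2, 1, 13),  # assumed class
}
avg = [sum(t[i] for t in per_class.values()) / len(per_class) for i in range(4)]
print(dict(zip(("TP", "FN", "FP", "TN"), avg)))
# {'TP': 6.33..., 'FN': 2.67..., 'FP': 2.67..., 'TN': 15.33...}
```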