In predictive analytics, a table of confusion (sometimes also called a confusion matrix) is a table with two rows and two columns that reports the number of false positives, false negatives, true positives, and true negatives. This allows more detailed analysis than the mere proportion of correct classifications (accuracy). Accuracy is not a reliable metric of the real performance of a classifier, because it yields misleading results when the data set is unbalanced (that is, when the number of samples in different classes varies greatly). For example, if there were 95 cats and only 5 dogs in the data set, the classifier could easily be biased into classifying all the samples as cats. The overall accuracy would be 95%, yet the classifier would have a 100% recognition rate for the cat class but a 0% recognition rate for the dog class.
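This effect is easy to reproduce. The following sketch assumes the 95-cat/5-dog data set above and a degenerate classifier that predicts "cat" for every sample, and prints the overall accuracy alongside the per-class recognition rates:

    # Assumed data: 95 cats and 5 dogs; the classifier always predicts "cat".
    actual    = ["cat"] * 95 + ["dog"] * 5
    predicted = ["cat"] * 100

    accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)

    # Per-class recognition rate: correct predictions within each actual class.
    def recognition_rate(cls):
        in_class = [(a, p) for a, p in zip(actual, predicted) if a == cls]
        return sum(a == p for a, p in in_class) / len(in_class)

    print(f"accuracy:   {accuracy:.0%}")                 # 95%
    print(f"cat class:  {recognition_rate('cat'):.0%}")  # 100%
    print(f"dog class:  {recognition_rate('dog'):.0%}")  # 0%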
Given the confusion matrix above, the corresponding table of confusion for the cat class would be:
                   Classified cat      Classified non-cat
    Actual cat     5 true positives    3 false negatives
    Actual non-cat 2 false positives   17 true negatives
The final table of confusion would contain the average values for all classes combined.
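As an illustration, the per-class counts can be derived mechanically from any multi-class confusion matrix. The sketch below uses a hypothetical three-class matrix (the matrix referred to above is not reproduced in this excerpt) whose cat row and column are consistent with the 5/3/2/17 counts shown in the table:

    # Illustrative multi-class confusion matrix: rows are actual classes,
    # columns are predicted classes (hypothetical counts).
    classes = ["cat", "dog", "rabbit"]
    matrix = [
        [5, 3, 0],   # actual cat
        [2, 3, 1],   # actual dog
        [0, 2, 11],  # actual rabbit
    ]
    total = sum(sum(row) for row in matrix)

    def table_of_confusion(i):
        tp = matrix[i][i]
        fn = sum(matrix[i]) - tp                 # actual class i, predicted as something else
        fp = sum(row[i] for row in matrix) - tp  # predicted class i, actually something else
        tn = total - tp - fn - fp
        return tp, fn, fp, tn

    for i, cls in enumerate(classes):
        print(cls, table_of_confusion(i))
    # cat -> (5, 3, 2, 17), matching the table above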