Constructing A Classifier From The Probability Model
The discussion so far has derived the independent feature model, that is, the naive Bayes probability model. The naive Bayes classifier combines this model with a decision rule. One common rule is to pick the hypothesis that is most probable; this is known as the maximum a posteriori (MAP) decision rule. The corresponding classifier is the function defined as follows:

\[
\operatorname{classify}(f_1, \dots, f_n) = \underset{c}{\operatorname{argmax}}\; p(C = c) \prod_{i=1}^{n} p(F_i = f_i \mid C = c)
\]
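As a minimal sketch of this decision rule, the following Python function picks the class that maximizes the product of the prior and the per-feature conditional probabilities. The input structures `priors` and `likelihoods` are hypothetical, chosen here for illustration; log-probabilities are summed rather than multiplying raw probabilities, to avoid floating-point underflow.

```python
import math

def map_classify(features, priors, likelihoods):
    """Return the class c maximizing p(C=c) * prod_i p(F_i=f_i | C=c).

    Hypothetical input layout (not from the original text):
      priors[c]            -> p(C = c)
      likelihoods[c][i][f] -> p(F_i = f | C = c)
    """
    best_class, best_score = None, float("-inf")
    for c, prior in priors.items():
        # Sum of logs is monotone in the product, so argmax is unchanged.
        score = math.log(prior)
        for i, f in enumerate(features):
            score += math.log(likelihoods[c][i][f])
        if score > best_score:
            best_class, best_score = c, score
    return best_class

# Example with made-up two-feature spam/ham probabilities:
priors = {"spam": 0.4, "ham": 0.6}
likelihoods = {
    "spam": [{"free": 0.8, "hello": 0.2}, {"!": 0.7, ".": 0.3}],
    "ham":  [{"free": 0.1, "hello": 0.9}, {"!": 0.2, ".": 0.8}],
}
print(map_classify(["free", "!"], priors, likelihoods))  # -> "spam"
```

In practice the conditional probabilities would be estimated from training data (often with smoothing so no likelihood is exactly zero), but the decision rule itself is just this argmax.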