Naive Bayes Classifier - Constructing A Classifier From The Probability Model

The discussion so far has derived the independent feature model, that is, the naive Bayes probability model. The naive Bayes classifier combines this model with a decision rule. One common rule is to pick the hypothesis that is most probable; this is known as the maximum a posteriori, or MAP, decision rule. The corresponding classifier is the function classify defined as follows:

$$\operatorname{classify}(f_1, \dots, f_n) = \underset{c}{\operatorname{argmax}}\; p(C = c) \prod_{i=1}^{n} p(F_i = f_i \mid C = c)$$

In words: the predicted class is the value of $c$ that maximizes the product of the class prior $p(C = c)$ and the per-feature likelihoods $p(F_i = f_i \mid C = c)$, which factorize thanks to the independence assumption.
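As a concrete illustration, below is a minimal Python sketch of this MAP rule for categorical features. The function names (`train`, `classify`), the toy weather data, and the use of Laplace smoothing and log-probabilities are illustrative assumptions, not part of the model derived above; smoothing and log-space arithmetic are standard practical safeguards, not requirements of the decision rule itself.

```python
# A minimal sketch of the MAP decision rule for a naive Bayes model with
# categorical features. The names `train` and `classify`, the toy data,
# and the Laplace smoothing / log-probability details are illustrative
# assumptions, not taken from the text above.
import math
from collections import Counter, defaultdict


def train(samples, labels):
    """Count-based maximum-likelihood estimates of p(C) and p(F_i | C)."""
    priors = Counter(labels)       # priors[c] = number of class-c samples
    counts = defaultdict(Counter)  # counts[(c, i)][v] = class-c samples with F_i = v
    values = defaultdict(set)      # values[i] = distinct observed values of F_i
    for x, c in zip(samples, labels):
        for i, v in enumerate(x):
            counts[(c, i)][v] += 1
            values[i].add(v)
    return priors, counts, values


def classify(x, priors, counts, values, alpha=1.0):
    """Return argmax_c p(C = c) * prod_i p(F_i = x_i | C = c).

    Sums of logs replace the product to avoid floating-point underflow,
    and Laplace smoothing (alpha) keeps unseen feature values from
    zeroing out an entire class.
    """
    total = sum(priors.values())
    best_class, best_logp = None, -math.inf
    for c, n_c in priors.items():
        logp = math.log(n_c / total)            # log p(C = c)
        for i, v in enumerate(x):
            num = counts[(c, i)][v] + alpha
            den = n_c + alpha * len(values[i])
            logp += math.log(num / den)         # log p(F_i = v | C = c)
        if logp > best_logp:
            best_class, best_logp = c, logp
    return best_class


# Toy usage: decide whether to play outside ("yes"/"no") from two features.
samples = [("sunny", "hot"), ("sunny", "mild"), ("rainy", "mild"), ("rainy", "cool")]
labels = ["yes", "yes", "no", "no"]
model = train(samples, labels)
print(classify(("rainy", "hot"), *model))  # -> "no" (the rain outweighs the heat)
```

Working in log space is the usual design choice here: with many features, the raw product of probabilities underflows to zero in floating point, while the argmax is unchanged by taking logarithms.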
