
Random Naive Bayes and Random Forest

Random Naive Bayes (Random NB) generalizes the idea behind Random Forest to Naive Bayes: it is a bagged classifier combining an ensemble of B Naive Bayes classifiers. The b-th classifier is estimated on a bootstrap sample Sb using m randomly selected features. To classify an observation, the input vector is passed through all B classifiers in the ensemble, each of which produces posterior class probabilities. Unlike Random Forest, the ensemble's predicted class is determined by adjusted majority voting rather than simple majority voting, because each Naive Bayes classifier delivers continuous posterior probabilities; these are averaged across the ensemble, and the class with the highest mean posterior is chosen. As in Random Forest, the importance of each feature is estimated on the out-of-bag (oob) data.
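The procedure above can be sketched in Python. This is a minimal illustration, not a reference implementation: the class name `RandomNaiveBayes` and the parameter names `B` and `m` are taken from the description, while the choice of scikit-learn's `GaussianNB` as the base learner is an assumption (the text does not fix a particular Naive Bayes variant), and out-of-bag feature importance is omitted for brevity.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB  # assumed base learner; any NB variant works

class RandomNaiveBayes:
    """Bagged ensemble of B Naive Bayes classifiers, each trained on a
    bootstrap sample with m randomly selected features (Random NB sketch)."""

    def __init__(self, B=25, m=2, seed=0):
        self.B, self.m = B, m
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n, p = X.shape
        self.classes_ = np.unique(y)
        self.members_ = []
        for _ in range(self.B):
            rows = self.rng.integers(0, n, size=n)                 # bootstrap sample Sb
            cols = self.rng.choice(p, size=self.m, replace=False)  # m random features
            clf = GaussianNB().fit(X[rows][:, cols], y[rows])
            self.members_.append((clf, cols))
        return self

    def predict_proba(self, X):
        # "Adjusted majority voting": average the continuous posterior
        # probabilities produced by each member, rather than counting hard votes.
        probs = np.zeros((X.shape[0], len(self.classes_)))
        for clf, cols in self.members_:
            # align each member's class order with the ensemble's class order
            p_b = np.zeros_like(probs)
            idx = np.searchsorted(self.classes_, clf.classes_)
            p_b[:, idx] = clf.predict_proba(X[:, cols])
            probs += p_b
        return probs / self.B

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

Averaging posteriors rather than tallying hard votes is what distinguishes the aggregation step from Random Forest's plain majority vote: a member that is 51% confident contributes less than one that is 99% confident.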

