Random Naive Bayes and Random Forest

Random Naive Bayes (Random NB) generalizes the Random Forest idea to Naive Bayes: it is a bagged classifier that combines a forest of B Naive Bayes models. Each bth Naive Bayes is estimated on a bootstrap sample Sb using m randomly selected features. To classify an observation, the input vector is passed to each of the B Naive Bayes models in the forest, and each one produces posterior class probabilities. Unlike Random Forest, the predicted class of the ensemble is determined by adjusted majority voting rather than plain majority voting, because each bth Naive Bayes delivers continuous posterior probabilities that can be combined across the ensemble. As in Random Forests, the importance of each feature is estimated on the out-of-bag (oob) data.
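The following is a minimal sketch of the procedure described above, assuming scikit-learn's GaussianNB as the base learner. The class name RandomNaiveBayes and the parameters n_estimators (B) and max_features (which controls m) are illustrative choices, not part of the original description; "adjusted majority voting" is interpreted here as averaging the posterior class probabilities of the B classifiers.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

class RandomNaiveBayes:
    """Bagged ensemble of B Naive Bayes classifiers, each trained on a
    bootstrap sample restricted to m randomly selected features (sketch)."""

    def __init__(self, n_estimators=100, max_features=0.5, random_state=None):
        self.n_estimators = n_estimators   # B
        self.max_features = max_features   # fraction of features used per learner (m / p)
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        n, p = X.shape
        m = max(1, int(self.max_features * p))
        self.classes_ = np.unique(y)
        self.estimators_ = []
        for _ in range(self.n_estimators):
            rows = self.rng.integers(0, n, size=n)            # bootstrap sample S_b
            cols = self.rng.choice(p, size=m, replace=False)  # m randomly selected features
            nb = GaussianNB().fit(X[rows][:, cols], y[rows])
            self.estimators_.append((nb, cols))
        return self

    def predict_proba(self, X):
        # "Adjusted majority voting": average the posterior class probabilities
        # produced by the B Naive Bayes classifiers.
        proba = np.zeros((X.shape[0], len(self.classes_)))
        for nb, cols in self.estimators_:
            p_b = nb.predict_proba(X[:, cols])
            # align this learner's class order with the ensemble's class order
            idx = np.searchsorted(self.classes_, nb.classes_)
            proba[:, idx] += p_b
        return proba / self.n_estimators

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

Usage would follow the usual estimator pattern, e.g. RandomNaiveBayes(n_estimators=200, max_features=0.5).fit(X_train, y_train).predict(X_test). The out-of-bag feature-importance step mentioned above could be added by permuting each feature on the rows not drawn into S_b and measuring the resulting drop in accuracy, as in Random Forests; that bookkeeping is omitted from the sketch for brevity.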
