Random Naive Bayes and Random Forest

Generalizing the Random Forest idea to Naive Bayes, Random Naive Bayes (Random NB) is a bagged classifier that combines an ensemble of B Naive Bayes classifiers. The b-th classifier is estimated on a bootstrap sample Sb using m randomly selected features. To classify an observation, the input vector is passed to each of the B Naive Bayes classifiers, and each one produces posterior class probabilities. Unlike Random Forest, the ensemble's predicted class is obtained by adjusted majority voting rather than plain majority voting, because each Naive Bayes classifier delivers continuous posterior probabilities that can be averaged. As in Random Forests, the importance of each feature is estimated on the out-of-bag (oob) data.
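The scheme above can be sketched with scikit-learn's `BaggingClassifier` wrapping `GaussianNB`: bagging with a random feature subset per estimator mirrors the bootstrap samples Sb and the m sampled features, and the bagged ensemble averages the per-classifier posterior probabilities, which plays the role of adjusted majority voting. The dataset and the specific values of B and m here are illustrative, not from the original description.

```python
# Sketch of Random Naive Bayes: an ensemble of B Gaussian Naive Bayes
# classifiers, each fit on a bootstrap sample with m randomly drawn features.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic data purely for illustration.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

random_nb = BaggingClassifier(
    GaussianNB(),      # base Naive Bayes learner
    n_estimators=50,   # B: number of classifiers in the ensemble
    max_features=5,    # m: features randomly drawn for each classifier
    bootstrap=True,    # each classifier trains on a bootstrap sample Sb
    oob_score=True,    # evaluate the ensemble on out-of-bag data
    random_state=0,
)
random_nb.fit(X, y)

# predict_proba averages the B posterior probability vectors
# ("adjusted majority voting"); predict takes the argmax of that average.
print("oob accuracy:", random_nb.oob_score_)
```

Passing `GaussianNB()` positionally keeps the sketch compatible across scikit-learn versions, where the keyword for the base learner changed from `base_estimator` to `estimator`.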
