Random Naive Bayes and Random Forest

Random Naive Bayes (Random NB) generalizes the Random Forest idea to Naive Bayes: it is a bagged classifier that combines a "forest" of B Naive Bayes classifiers. The b-th Naive Bayes classifier is estimated on a bootstrap sample S_b using m randomly selected features. To classify an observation, the input vector is put down each of the B Naive Bayes classifiers in the forest, and each one generates posterior class probabilities. Unlike Random Forest, the ensemble's predicted class is determined by adjusted majority voting rather than plain majority voting, because each Naive Bayes classifier delivers continuous posterior probabilities. As in Random Forest, the importance of each feature is estimated on the out-of-bag (oob) data.
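
A minimal sketch of this scheme in Python, built on scikit-learn's GaussianNB. The class name RandomNaiveBayes, its parameters n_estimators (B) and m_features (m), and the use of averaged posteriors as the "adjusted majority vote" are illustrative assumptions rather than a reference implementation; out-of-bag feature importance is omitted for brevity.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

class RandomNaiveBayes:
    """Sketch of a bagged ensemble of Naive Bayes classifiers, each fit on a
    bootstrap sample using a random subset of m features (hypothetical API)."""

    def __init__(self, n_estimators=100, m_features=None, random_state=None):
        self.n_estimators = n_estimators            # B in the text
        self.m_features = m_features                # m in the text
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        n_samples, n_features = X.shape
        # default feature-subset size: sqrt(p), by analogy with Random Forest
        m = self.m_features or max(1, int(np.sqrt(n_features)))
        self.classes_ = np.unique(y)
        self.estimators_ = []
        for _ in range(self.n_estimators):
            # bootstrap sample S_b and a random subset of m features
            rows = self.rng.integers(0, n_samples, size=n_samples)
            cols = self.rng.choice(n_features, size=m, replace=False)
            nb = GaussianNB().fit(X[np.ix_(rows, cols)], y[rows])
            self.estimators_.append((nb, cols))
        return self

    def predict_proba(self, X):
        # "adjusted majority voting": average the continuous posteriors
        proba = np.zeros((X.shape[0], len(self.classes_)))
        for nb, cols in self.estimators_:
            p = nb.predict_proba(X[:, cols])
            # align columns with self.classes_ (a bootstrap sample may
            # be missing some classes)
            idx = np.searchsorted(self.classes_, nb.classes_)
            proba[:, idx] += p
        return proba / self.n_estimators

    def predict(self, X):
        return self.classes_[np.argmax(self.predict_proba(X), axis=1)]
```

Under these assumptions the ensemble is used like any scikit-learn-style classifier, e.g. RandomNaiveBayes(n_estimators=50, random_state=0).fit(X_train, y_train).predict(X_test).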
