Random Naive Bayes and Random Forest
Random Naive Bayes (Random NB) generalizes the Random Forest idea to Naive Bayes: it is a bagged classifier combining an ensemble of B Naive Bayes classifiers. Each classifier b is estimated on a bootstrap sample S_b using m randomly selected features. To classify an observation, the input vector is passed to each of the B Naive Bayes classifiers in the ensemble, and each one generates posterior class probabilities. Unlike Random Forest, the ensemble's predicted class is determined by adjusted majority voting rather than simple majority voting, because each Naive Bayes classifier delivers continuous posterior probabilities rather than a hard vote: the posteriors are combined (typically by averaging) and the class with the highest combined posterior is predicted. As in Random Forest, the importance of each feature is estimated on the out-of-bag (oob) data.
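A minimal sketch of this scheme in Python, assuming scikit-learn's GaussianNB as the base learner; the class name RandomNaiveBayes and its parameters are illustrative rather than from the original text, adjusted majority voting is implemented here as averaging the posterior probabilities across the B classifiers, and the out-of-bag feature-importance step is omitted:

    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    class RandomNaiveBayes:
        def __init__(self, n_estimators=100, n_features=None, random_state=None):
            self.n_estimators = n_estimators  # B: number of Naive Bayes classifiers
            self.n_features = n_features      # m: features drawn per classifier
            self.rng = np.random.default_rng(random_state)

        def fit(self, X, y):
            X, y = np.asarray(X), np.asarray(y)
            n, p = X.shape
            m = self.n_features or max(1, int(np.sqrt(p)))
            self.classes_ = np.unique(y)
            self.models_ = []
            for _ in range(self.n_estimators):
                rows = self.rng.integers(0, n, size=n)            # bootstrap sample S_b
                cols = self.rng.choice(p, size=m, replace=False)  # m random features
                nb = GaussianNB().fit(X[rows][:, cols], y[rows])
                self.models_.append((nb, cols))
            return self

        def predict_proba(self, X):
            # Adjusted majority voting: average the continuous posterior
            # probabilities over all B classifiers.
            X = np.asarray(X)
            proba = np.zeros((X.shape[0], len(self.classes_)))
            for nb, cols in self.models_:
                # A bootstrap sample may miss some classes, so align each
                # classifier's class order with the ensemble's class order.
                idx = np.searchsorted(self.classes_, nb.classes_)
                proba[:, idx] += nb.predict_proba(X[:, cols])
            return proba / self.n_estimators

        def predict(self, X):
            return self.classes_[np.argmax(self.predict_proba(X), axis=1)]

Note that each classifier is trained only on its own m-feature subset, so at prediction time the same column subset is selected before calling predict_proba; this mirrors how Random Forest restricts each tree to random feature subsets.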