Generalization Error Bounds
One theoretical motivation behind margin classifiers is that their generalization error may be bounded by parameters of the algorithm and a margin term. An example of such a bound is for the AdaBoost algorithm. Let $S$ be a set of $m$ examples sampled independently at random from a distribution $D$. Assume the VC-dimension of the underlying base classifier is $d$ and $m \geq d \geq 1$. Then with probability $1 - \delta$ we have the bound

$$P_D\left(y f(x) \leq 0\right) \leq P_S\left(y f(x) \leq \theta\right) + O\!\left(\frac{1}{\sqrt{m}} \sqrt{\frac{d \log^2(m/d)}{\theta^2} + \log(1/\delta)}\right)$$

for all $\theta > 0$.
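As a rough illustration of how the two sides of this bound are evaluated, the following sketch (not part of the original article) computes the empirical margin term $P_S(y f(x) \leq \theta)$ and the complexity term from a sample of training margins. The constant hidden in the $O(\cdot)$ notation is unspecified by the theorem, so the complexity term is only an order-of-magnitude quantity; the variable names and the synthetic margin distribution are hypothetical.

```python
import numpy as np

def margin_bound_terms(margins, d, delta, theta):
    """margins: array of y_i * f(x_i) on the training sample S (f normalized so
    margins lie in [-1, 1]); d: VC-dimension of the base classifier;
    delta: failure probability; theta: margin threshold (> 0)."""
    m = len(margins)
    # Empirical margin term: fraction of training examples with margin <= theta.
    empirical_term = np.mean(margins <= theta)
    # Complexity term, up to the unspecified constant in the O(.) notation.
    complexity_term = np.sqrt((d * np.log(m / d) ** 2 / theta ** 2
                               + np.log(1 / delta)) / m)
    return empirical_term, complexity_term

# Example: 1000 training examples, a base classifier of VC-dimension 2,
# confidence 1 - delta = 0.95, margin threshold theta = 0.1.
rng = np.random.default_rng(0)
margins = rng.uniform(-0.2, 1.0, size=1000)  # stand-in margin distribution
emp, comp = margin_bound_terms(margins, d=2, delta=0.05, theta=0.1)
print(f"P_S(yf(x) <= theta) = {emp:.3f}, complexity term ~ {comp:.3f}")
```

Larger margin thresholds $\theta$ shrink the complexity term but increase the empirical term, which is the trade-off the bound expresses.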