Generalization Error Bounds
One theoretical motivation behind margin classifiers is that their generalization error can be bounded in terms of parameters of the algorithm and a margin term. An example of such a bound is the following one for the AdaBoost algorithm. Let $S$ be a set of $m$ examples sampled independently at random from a distribution $D$. Assume the VC-dimension of the underlying base classifier is $d$ and $m \geq d \geq 1$. Then with probability $1 - \delta$ we have the bound

$$P_D\left(y f(x) \leq 0\right) \;\leq\; P_S\left(y f(x) \leq \theta\right) + O\!\left(\frac{1}{\sqrt{m}} \sqrt{\frac{d \log^2(m/d)}{\theta^2} + \log(1/\delta)}\right)$$

for all $\theta > 0$. Here $P_D$ denotes probability under the distribution $D$, $P_S$ the empirical probability on the sample $S$, and $y f(x)$ the margin of the combined classifier on an example $(x, y)$.
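As a rough numerical illustration, the complexity term of the bound can be evaluated for concrete values of $m$, $d$, $\theta$, and $\delta$. This is only a sketch: the function name is invented here, and the unspecified constant hidden in the $O(\cdot)$ notation is taken to be 1, so the numbers indicate scaling behavior rather than an actual error guarantee.

```python
import math

def margin_bound_term(m, d, theta, delta):
    """Complexity term of the AdaBoost margin bound, with the O(.)
    constant assumed to be 1 (illustrative only).

    m     -- number of training examples
    d     -- VC-dimension of the base classifier
    theta -- margin threshold (theta > 0)
    delta -- confidence parameter (bound holds with prob. 1 - delta)
    """
    # sqrt( (d * log^2(m/d) / theta^2 + log(1/delta)) / m )
    complexity = d * math.log(m / d) ** 2 / theta ** 2
    return math.sqrt((complexity + math.log(1.0 / delta)) / m)

# The term shrinks as the sample size m grows...
small_sample = margin_bound_term(m=1_000, d=10, theta=0.1, delta=0.05)
large_sample = margin_bound_term(m=1_000_000, d=10, theta=0.1, delta=0.05)

# ...and grows as the required margin theta shrinks.
loose_margin = margin_bound_term(m=1_000, d=10, theta=0.5, delta=0.05)
```

Note that for small samples the term can exceed 1, in which case the bound is vacuous; the guarantee only becomes meaningful once $m$ is large relative to $d/\theta^2$.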