Generalization Error Bounds
One theoretical motivation behind margin classifiers is that their generalization error may be bounded by parameters of the algorithm and a margin term. An example of such a bound is for the AdaBoost algorithm. Let $S$ be a set of $m$ examples sampled independently at random from a distribution $D$. Assume the VC-dimension of the underlying base classifier is $d$ and $m \ge d \ge 1$. Then with probability $1 - \delta$ we have the bound

$$P\left(yf(x) \le \theta\right) \le P_S\left(yf(x) \le \theta\right) + O\left(\frac{1}{\sqrt{m}}\sqrt{\frac{d\log^2(m/d)}{\theta^2} + \log(1/\delta)}\right)$$

for all $\theta > 0$.
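To see how the bound behaves, the right-hand side can be evaluated numerically. The sketch below is illustrative only: the $O(\cdot)$ in the theorem hides an unspecified constant, which is set to an assumed placeholder value of 1 here, and the empirical margin rate $P_S(yf(x) \le \theta)$ is supplied as an input rather than measured from real data.

```python
import math

def margin_bound(m, d, theta, delta, empirical_rate, c=1.0):
    """Evaluate the right-hand side of the AdaBoost margin bound.

    m              -- number of i.i.d. training examples
    d              -- VC-dimension of the base classifier (m >= d >= 1)
    theta          -- margin threshold (> 0)
    delta          -- confidence parameter; bound holds w.p. 1 - delta
    empirical_rate -- fraction of training examples with margin
                      y*f(x) <= theta
    c              -- placeholder for the hidden O(.) constant (assumed)
    """
    complexity = (c / math.sqrt(m)) * math.sqrt(
        d * math.log(m / d) ** 2 / theta ** 2 + math.log(1 / delta)
    )
    return empirical_rate + complexity

# The complexity term shrinks as the sample size m grows and grows as
# the margin threshold theta shrinks, matching the 1/sqrt(m) and
# 1/theta^2 dependence in the bound.
loose = margin_bound(m=1_000, d=10, theta=0.1, delta=0.05, empirical_rate=0.02)
tight = margin_bound(m=100_000, d=10, theta=0.1, delta=0.05, empirical_rate=0.02)
```

Note that with the hidden constant unspecified, such a computation shows only the *shape* of the trade-off between sample size, margin, and base-classifier complexity, not a numerically meaningful error estimate.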