Examples of Margin-based Algorithms

Many classifiers can give an associated margin for each example. However, only some classifiers use margin information while learning from a data set.

Many boosting algorithms rely on the notion of a margin to assign weights to examples. If a convex loss is used (as in AdaBoost, LogitBoost, and all members of the AnyBoost family of algorithms), then an example with a higher margin receives less (or equal) weight than an example with a lower margin. This leads the boosting algorithm to concentrate weight on low-margin examples. In non-convex algorithms (e.g. BrownBoost), the margin still dictates the weighting of an example, though the weighting is non-monotone with respect to margin. There exist boosting algorithms that provably maximize the minimum margin.
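As a concrete illustration of the convex-loss case, the sketch below computes AdaBoost-style example weights, which are proportional to exp(-margin); the function name and the toy margin values are illustrative, not from the original text.

```python
import numpy as np

def adaboost_example_weights(margins):
    """Illustrative sketch: under AdaBoost's exponential loss, each
    example gets weight proportional to exp(-margin), where margin is
    y_i * f(x_i) for the current combined classifier f. Lower-margin
    examples therefore dominate the next boosting round."""
    w = np.exp(-np.asarray(margins, dtype=float))
    return w / w.sum()  # normalize to a distribution over examples

# A misclassified point (negative margin) receives the most weight.
print(adaboost_example_weights([2.0, 0.5, -1.0]))
```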

Support vector machines provably maximize the margin of the separating hyperplane. When trained on noisy data that cannot be perfectly separated in the given space, support vector machines maximize the soft margin instead. Further discussion can be found in the support vector machine article.
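A minimal sketch of the quantity being optimized, assuming the standard primal soft-margin formulation: the objective trades the squared norm of the weight vector (a wide margin) against hinge-loss slack on points that violate the margin. The function name, the constant C, and the toy data are chosen here for illustration.

```python
import numpy as np

def soft_margin_objective(w, b, X, y, C=1.0):
    """Primal soft-margin SVM objective: (1/2)||w||^2 + C * total hinge
    loss. Labels y are +/-1; slack is paid only by points whose margin
    y_i * (w . x_i + b) falls below 1."""
    margins = y * (X @ w + b)
    hinge = np.maximum(0.0, 1.0 - margins)
    return 0.5 * float(w @ w) + C * float(hinge.sum())

# Toy usage: the third point lies close to the hyperplane and incurs slack.
X = np.array([[2.0, 1.0], [-1.0, -2.0], [0.2, -0.1]])
y = np.array([1.0, -1.0, 1.0])
print(soft_margin_objective(np.array([1.0, 1.0]), 0.0, X, y, C=1.0))
```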

The voted-perceptron algorithm is a margin-maximizing algorithm based on an iterative application of the classic perceptron algorithm, as sketched below.
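Below is a minimal, illustrative sketch of the voted perceptron in Freund and Schapire's formulation: the classic perceptron is run repeatedly, every intermediate weight vector is stored together with the number of examples it survived, and prediction is a survival-count-weighted vote. The function names and the epochs parameter are assumptions made for this sketch.

```python
import numpy as np

def voted_perceptron(X, y, epochs=10):
    """Sketch of the voted perceptron: run the classic perceptron,
    keeping each intermediate weight vector with its survival count.
    Labels y are +/-1."""
    w = np.zeros(X.shape[1])
    survivors = []  # list of (weight_vector, survival_count) pairs
    count = 1
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:
                # Mistake: retire the current vector, then update it.
                survivors.append((w.copy(), count))
                w = w + yi * xi
                count = 1
            else:
                count += 1
    survivors.append((w, count))
    return survivors

def predict(survivors, x):
    """Predict by a vote of all stored vectors, weighted by survival count."""
    vote = sum(c * np.sign(w @ x) for w, c in survivors)
    return np.sign(vote)
```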
