Examples of Margin-based Algorithms

Many classifiers can assign a margin to each example. However, only some classifiers use this margin information while learning from a data set.
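
For a real-valued classifier f and a labeled example (x, y) with y in {-1, +1}, the margin is commonly taken to be y·f(x). The following minimal sketch (function and variable names here are illustrative, not from any particular library) computes these margins for a linear classifier:

```python
import numpy as np

def margins(w, b, X, y):
    """Functional margins y_i * f(x_i) of a linear classifier f(x) = w.x + b.

    A positive margin means the example is correctly classified;
    a larger magnitude means it is classified with more confidence.
    """
    return y * (X @ w + b)

# Toy example: two points, one on each side of the hyperplane.
X = np.array([[2.0, 1.0], [-1.0, 0.5]])
y = np.array([1, -1])
w = np.array([1.0, -1.0])
print(margins(w, 0.0, X, y))  # both margins are positive here
```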

Many boosting algorithms rely on the notion of a margin to assign weights to examples. If a convex loss is used (as in AdaBoost, LogitBoost, and all members of the AnyBoost family of algorithms), then an example with a higher margin receives less (or equal) weight than an example with a lower margin. This leads the boosting algorithm to concentrate weight on low-margin examples. In non-convex algorithms (e.g. BrownBoost), the margin still dictates the weighting of an example, though the weighting is non-monotone in the margin. There exist boosting algorithms, such as LPBoost, that provably maximize the minimum margin.
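
In AdaBoost, for instance, the exponential loss gives an example with margin m the unnormalized weight exp(-m), so weight decreases monotonically as the margin grows. A short sketch of this weighting rule (the function name is illustrative; this is not a full boosting loop):

```python
import numpy as np

def adaboost_weights(margins):
    """Example-weight distribution implied by AdaBoost's exponential loss.

    An example with margin m_i gets unnormalized weight exp(-m_i), so
    low-margin (hard) examples receive exponentially more weight than
    high-margin (easy) ones.
    """
    w = np.exp(-margins)
    return w / w.sum()  # normalize to a probability distribution

margins = np.array([2.0, 0.5, -1.0])  # easy, borderline, misclassified
print(adaboost_weights(margins))      # most weight falls on the misclassified example
```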

Support vector machines provably maximize the margin of the separating hyperplane. Support vector machines trained on noisy data (where no perfect separation of the data exists in the given space) maximize the soft margin instead. More discussion of this can be found in the support vector machine article.
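
As a concrete illustration, here is a minimal soft-margin example using scikit-learn (assuming it is installed). The C parameter trades margin width against training errors; for a linear SVM in canonical scaling, the margin half-width is 1/||w||:

```python
import numpy as np
from sklearn.svm import SVC

# Noisy, not perfectly separable two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1.0, (50, 2)), rng.normal(1, 1.0, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Small C -> wider (softer) margin, tolerates more violations;
# large C -> narrower margin, penalizes violations heavily.
clf = SVC(kernel="linear", C=1.0).fit(X, y)

w = clf.coef_[0]
geometric_margin = 1.0 / np.linalg.norm(w)  # half-width of the margin band
print(f"margin half-width: {geometric_margin:.3f}")
```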

The voted-perceptron algorithm is a margin-maximizing algorithm based on an iterative application of the classic perceptron algorithm.
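
A compact sketch of the voted perceptron in the spirit of Freund and Schapire follows: each intermediate weight vector casts a vote weighted by the number of examples it survived. Function names are illustrative, and details such as the initial survival count vary slightly across presentations:

```python
import numpy as np

def voted_perceptron_train(X, y, epochs=10):
    """Train a voted perceptron; returns (weight vector, survival count) pairs."""
    w = np.zeros(X.shape[1])
    c = 1                                # survival count of the current vector
    voters = []
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:       # mistake: retire old vector, update
                voters.append((w.copy(), c))
                w = w + yi * xi
                c = 1
            else:
                c += 1
    voters.append((w, c))
    return voters

def voted_perceptron_predict(voters, x):
    """Predict by a weighted vote of all intermediate perceptrons."""
    score = sum(c * np.sign(w @ x) for w, c in voters)
    return 1 if score >= 0 else -1
```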
