Support Vector Machine - Soft Margin

Soft Margin

In 1995, Corinna Cortes and Vladimir N. Vapnik suggested a modified maximum margin idea that allows for mislabeled examples. If there exists no hyperplane that can split the "yes" and "no" examples, the Soft Margin method will choose a hyperplane that splits the examples as cleanly as possible, while still maximizing the distance to the nearest cleanly split examples. The method introduces non-negative slack variables \xi_i, which measure the degree of misclassification of the data \mathbf{x_i}:

y_i(\mathbf{w}\cdot\mathbf{x_i} - b) \ge 1 - \xi_i \quad 1 \le i \le n. \quad\quad(2)
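As a concrete illustration (a minimal sketch, not from the original paper), the slack values can be read off any candidate hyperplane; the following Python fragment, with made-up data and an arbitrarily chosen w and b, evaluates \xi_i = \max(0,\, 1 - y_i(\mathbf{w}\cdot\mathbf{x_i} - b)):

import numpy as np

# Toy data: rows of X are the points x_i; y holds labels y_i in {-1, +1}.
X = np.array([[2.0, 1.0], [0.2, -0.5], [-1.0, -1.0], [-2.0, 0.5]])
y = np.array([1, 1, -1, -1])

# An arbitrary candidate hyperplane (w, b), chosen for illustration only.
w = np.array([1.0, 0.5])
b = 0.0

# Slack variables: zero for cleanly split points, positive for violations.
xi = np.maximum(0.0, 1.0 - y * (X @ w - b))
print(xi)   # only the second point lies inside the margin, so xi_1 > 0

Here only the second point incurs a non-zero slack; the other three are cleanly split by this hyperplane.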

The objective function is then increased by a function which penalizes non-zero \xi_i, and the optimization becomes a trade-off between a large margin and a small error penalty. If the penalty function is linear, the optimization problem becomes:

\min_{\mathbf{w},\mathbf{\xi}, b } \left\{ \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^n \xi_i \right\}

subject to (for any i = 1, \dots, n)

y_i(\mathbf{w}\cdot\mathbf{x_i} - b) \ge 1 - \xi_i, \quad \xi_i \ge 0.
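With the linear penalty, the primal is equivalent to the unconstrained hinge-loss objective \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_i \max(0,\, 1 - y_i(\mathbf{w}\cdot\mathbf{x_i} - b)). The following Python sketch minimizes it by plain batch subgradient descent; this solver choice is an assumption made here for illustration, not the quadratic-programming approach usually used in practice:

import numpy as np

def soft_margin_svm(X, y, C=1.0, lr=0.01, epochs=1000):
    # Minimize 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w . x_i - b))
    # by batch subgradient descent (illustrative sketch, not a QP solver).
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w - b)
        viol = margins < 1.0                     # points with xi_i > 0
        # Subgradients: the regularizer contributes w; each violating
        # point contributes -C*y_i*x_i to w and +C*y_i to b.
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

For example, w, b = soft_margin_svm(X, y, C=1.0) on the toy data above yields a hyperplane that tolerates the margin-violating point instead of failing outright, which is exactly the behavior the Soft Margin method was introduced for.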

This constraint in (2), along with the objective of minimizing \|\mathbf{w}\|, can be solved using Lagrange multipliers as done above. One then has to solve the following problem:

\min_{\mathbf{w},\mathbf{\xi}, b } \max_{\boldsymbol{\alpha},\boldsymbol{\beta} }
\left \{ \frac{1}{2}\|\mathbf{w}\|^2
+ C \sum_{i=1}^n \xi_i
- \sum_{i=1}^{n} \alpha_i \left[ y_i(\mathbf{w}\cdot\mathbf{x_i} - b) - 1 + \xi_i \right]
- \sum_{i=1}^{n} \beta_i \xi_i \right \}

with \alpha_i, \beta_i \ge 0.
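Setting the partial derivatives of this Lagrangian to zero (a standard intermediate step, sketched here for completeness) gives

\frac{\partial}{\partial \mathbf{w}} : \ \mathbf{w} = \sum_{i=1}^{n} \alpha_i y_i \mathbf{x_i}, \qquad
\frac{\partial}{\partial b} : \ \sum_{i=1}^{n} \alpha_i y_i = 0, \qquad
\frac{\partial}{\partial \xi_i} : \ C - \alpha_i - \beta_i = 0,

and since \beta_i \ge 0, the last condition is equivalent to the box constraint 0 \le \alpha_i \le C on the dual variables.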
