
Boosting For Binary Categorization

We use AdaBoost for face detection as an example of binary categorization: the two categories are faces versus background. The general algorithm, illustrated by the sketch after the list, is as follows:

  1. Form a large set of simple features
  2. Initialize weights for training images
  3. For T rounds:
    1. Normalize the weights
    2. For each available feature in the set, train a classifier using that single feature and evaluate its training error
    3. Choose the classifier with the lowest error
    4. Update the weights of the training images: increase the weights of images this classifier misclassified and decrease the weights of those it classified correctly
  4. Form the final strong classifier as a linear combination of the T classifiers, with a larger coefficient for classifiers that achieved a lower training error

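The following is a minimal sketch of this loop, assuming each feature is a column of a precomputed feature matrix and each weak classifier is a single-feature threshold (a decision stump). It uses the standard AdaBoost reweighting rule rather than the exact Viola–Jones variant, so it is illustrative rather than a faithful reimplementation.

```python
import numpy as np

def train_stump(x, y, w):
    """Find the threshold and polarity on one feature that minimize the
    weighted training error; returns (error, threshold, polarity)."""
    best = (np.inf, 0.0, 1)
    for thresh in np.unique(x):
        for polarity in (1, -1):
            pred = np.where(polarity * x < polarity * thresh, 1, -1)
            err = np.sum(w[pred != y])
            if err < best[0]:
                best = (err, thresh, polarity)
    return best

def adaboost(X, y, T):
    """X: (n_images, n_features) feature matrix, y: labels in {-1, +1}.
    Returns the strong classifier as a list of (alpha, feature, threshold, polarity)."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                      # step 2: initialize image weights
    strong = []
    for _ in range(T):                           # step 3: T boosting rounds
        w /= w.sum()                             # 3.1 normalize the weights
        # 3.2-3.3: train one stump per feature, keep the one with lowest weighted error
        candidates = [(train_stump(X[:, j], y, w), j) for j in range(d)]
        (err, thresh, pol), j = min(candidates, key=lambda c: c[0][0])
        err = max(err, 1e-10)                    # guard against division by zero
        alpha = 0.5 * np.log((1.0 - err) / err)  # larger coefficient for smaller error
        pred = np.where(pol * X[:, j] < pol * thresh, 1, -1)
        w *= np.exp(-alpha * y * pred)           # 3.4 raise weights of misclassified images
        strong.append((alpha, j, thresh, pol))
    return strong

def predict(strong, X):
    """Step 4: the strong classifier is the sign of the weighted vote of the T stumps."""
    score = sum(a * np.where(p * X[:, j] < p * t, 1, -1)
                for a, j, t, p in strong)
    return np.sign(score)
```
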
After boosting, a classifier constructed from 200 features could yield a 95% detection rate at a false positive rate on the order of 10^-5.

Another application of boosting to binary categorization is a system that detects pedestrians using patterns of motion and appearance. This work was the first to combine motion information and appearance information as features to detect a walking person, and it takes an approach similar to the face detection work of Viola and Jones.
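
As a rough illustration of the idea, the combined representation might be built as below, assuming consecutive grayscale frames and a hypothetical `rect_features` extractor (not part of the original work) that maps an image to a 1-D feature vector; the result can be fed to the same boosting loop sketched above.

```python
import numpy as np

def combined_features(prev_frame, curr_frame, rect_features):
    """Concatenate appearance features (from the current frame) with motion
    features (from the absolute frame difference). `rect_features` is a
    hypothetical extractor mapping an image to a 1-D feature vector."""
    appearance = rect_features(curr_frame)
    delta = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    motion = rect_features(delta)
    return np.concatenate([appearance, motion])
```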
