The AdaBoost algorithm combines a set of weak rules into a more precise resulting classifier.

Let $T = \{(x_1,y_1),\ldots,(x_m,y_m)\}$ be a training set of observable vectors $x_i \in \mathbb{R}^n$ and corresponding binary hidden states $y_i \in \{-1,+1\}$. The AdaBoost classifier sequentially calls a weak learner; its input is the set $T$ and a discrete distribution $D$ over the training examples. The task of the weak learner is to train a rule $h\colon \mathbb{R}^n \to \{-1,+1\}$ which minimizes the weighted error
$$\varepsilon = \sum_{i=1}^{m} D(i)\,[\![h(x_i) \neq y_i]\!] \qquad (3)$$
where $[\![\cdot]\!]$ denotes the 0/1-loss function, which attains 1 for $h(x_i) \neq y_i$ and 0 for $h(x_i) = y_i$. The weak learner is required to produce a rule whose weighted error (3) is at least a bit better than random guessing, i.e., $\varepsilon < \frac{1}{2}$.
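The weighted error (3) can be computed directly from its definition. The following is a minimal pure-Python sketch (the STPRtool itself is a Matlab toolbox; names and data here are illustrative only):

```python
# Weighted 0/1 error of a rule h on a training set, as in Eq. (3).

def weighted_error(h, X, y, D):
    """Return sum_i D(i) * [[h(x_i) != y_i]]."""
    return sum(d for x, yi, d in zip(X, y, D) if h(x) != yi)

# Toy data: 1-D observations with hidden states in {-1, +1}.
X = [[0.2], [0.8], [1.5], [2.1]]
y = [-1, -1, +1, +1]
D = [0.25, 0.25, 0.25, 0.25]           # uniform distribution over examples

h = lambda x: 1 if x[0] > 1.0 else -1  # a candidate weak rule
print(weighted_error(h, X, y, D))      # prints 0: h classifies every example correctly
```

A constant rule such as `lambda x: -1` would score $\varepsilon = 0.5$ on this data, i.e., exactly random guessing.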
The AdaBoost combines the weak rules into a more precise final classifier
$$f(x) = \mathrm{sign}\left(\sum_{t=1}^{T} \alpha_t h_t(x)\right),$$
where $\alpha_t$, $t = 1,\ldots,T$, are weights assigned by the AdaBoost to each rule $h_t$.
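Evaluating the final classifier is a weighted vote of the weak rules. A pure-Python sketch, with hand-made rules and hypothetical weights purely for illustration:

```python
# The combined classifier f(x) = sign(sum_t alpha_t * h_t(x)).

def ada_classify(rules, alphas, x):
    """Weighted vote of the weak rules; a tie is broken towards +1."""
    s = sum(a * h(x) for h, a in zip(rules, alphas))
    return 1 if s >= 0 else -1

# Two hand-made weak rules on 1-D inputs and their (hypothetical) weights.
rules = [lambda x: 1 if x[0] > 1.0 else -1,
         lambda x: 1 if x[0] > 0.5 else -1]
alphas = [0.8, 0.3]

print(ada_classify(rules, alphas, [0.7]))  # rules vote -1/+1; weighted sum -0.5 -> -1
```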
The AdaBoost is proven to decrease in each step an upper bound on the empirical error
$$\varepsilon_{\mathrm{emp}} = \frac{1}{m} \sum_{i=1}^{m} [\![f(x_i) \neq y_i]\!].$$
Therefore the AdaBoost can produce a classifier with arbitrarily low empirical error, which, however, does not necessarily correspond to a low generalization error. This implies that the number of weak rules must be chosen appropriately. Cross-validation is one possible way to resolve this problem.
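The selection can be sketched as follows: partition the data into folds, track the validation error of the ensemble truncated to its first $t$ rules, and keep the $t$ with the lowest error. Neither function below reproduces the toolbox's crossval routine; the names are illustrative assumptions.

```python
def kfold_indices(m, k):
    """Partition example indices 0..m-1 into k roughly equal folds."""
    folds = [[] for _ in range(k)]
    for i in range(m):
        folds[i % k].append(i)
    return folds

def select_num_rules(val_errors):
    """Given val_errors[t] = validation error of the ensemble truncated to
    its first t+1 weak rules, return the number of rules that minimizes it."""
    best_t = min(range(len(val_errors)), key=lambda t: val_errors[t])
    return best_t + 1

print(kfold_indices(7, 3))                         # [[0, 3, 6], [1, 4], [2, 5]]
print(select_num_rules([0.30, 0.21, 0.24, 0.28]))  # 2
```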
A selection of proper weak rules is an application-dependent problem. A useful example is a weak rule defined as
$$h(x) = \mathrm{sign}(x_j - \theta), \qquad (4)$$
where $j \in \{1,\ldots,n\}$ is a feature index and $\theta \in \mathbb{R}$ is a bias. The parameters $j$ and $\theta$ are trained to minimize the weighted error (3). The weak rule (4) uses just one feature for classification, so the AdaBoost algorithm combining rules of this type can be used as a feature selection method.
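Training the rule (4) can be done by exhaustive search over the feature index $j$ and candidate thresholds $\theta$. This is a hedged pure-Python sketch; the weak learner shipped with the toolbox may differ in details:

```python
def train_stump(X, y, D):
    """Exhaustive search for the feature index j and bias theta of rule (4),
    h(x) = sign(x_j - theta), minimizing the weighted error (3)."""
    n = len(X[0])
    best_err, best_j, best_theta = float("inf"), 0, 0.0
    for j in range(n):
        vals = sorted(set(x[j] for x in X))
        # Candidate thresholds: below the smallest value, and all midpoints.
        cands = [vals[0] - 1.0] + [(a + b) / 2.0 for a, b in zip(vals, vals[1:])]
        for theta in cands:
            err = sum(d for x, yi, d in zip(X, y, D)
                      if (1 if x[j] - theta > 0 else -1) != yi)
            if err < best_err:
                best_err, best_j, best_theta = err, j, theta
    rule = lambda x, j=best_j, t=best_theta: 1 if x[j] - t > 0 else -1
    return rule, best_err

X = [[0.2], [0.8], [1.5], [2.1]]
y = [-1, -1, +1, +1]
D = [0.25] * 4
rule, err = train_stump(X, y, D)
print(err)  # prints 0: the midpoint between 0.8 and 1.5 separates the classes
```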
- Implement the learning algorithm of the weak rule (4) and
incorporate it into the AdaBoost algorithm implemented in the function
adaboost of the STPRtool.
- Apply the AdaBoost algorithm to train a binary classifier for Brodatz
textures
[brodatz1_trn.mat].
Use the cross-validation procedure to select the proper number of weak
rules.
- Validate the resulting AdaBoost classifier on the testing data
[brodatz1_tst.mat].
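The training loop that the adaboost function implements can be sketched in standard AdaBoost form: each round trains a weak rule on the current distribution $D$, weights it by $\alpha_t = \frac{1}{2}\ln\frac{1-\varepsilon_t}{\varepsilon_t}$, and re-weights $D$. This pure-Python sketch is not the toolbox's Matlab implementation; stump_learner is a toy weak learner added here only so the example runs:

```python
import math

def adaboost_train(X, y, weak_learner, num_rules):
    """Sequentially call weak_learner(X, y, D); weight each returned rule by
    alpha_t = 0.5*ln((1 - eps_t)/eps_t) and re-weight the distribution D."""
    m = len(X)
    D = [1.0 / m] * m
    rules, alphas = [], []
    for _ in range(num_rules):
        h, eps = weak_learner(X, y, D)
        if eps >= 0.5:            # no better than random guessing: stop early
            break
        eps = max(eps, 1e-10)     # guard the division below when eps == 0
        alpha = 0.5 * math.log((1.0 - eps) / eps)
        # Increase the weight of misclassified examples, then renormalize.
        D = [d * math.exp(-alpha * yi * h(x)) for x, yi, d in zip(X, y, D)]
        Z = sum(D)
        D = [d / Z for d in D]
        rules.append(h)
        alphas.append(alpha)
    return rules, alphas

def stump_learner(X, y, D):
    """Toy weak learner: best threshold on the first feature only."""
    best = None
    for theta in (x[0] for x in X):
        h = lambda x, t=theta: 1 if x[0] >= t else -1
        eps = sum(d for x, yi, d in zip(X, y, D) if h(x) != yi)
        if best is None or eps < best[1]:
            best = (h, eps)
    return best

X = [[0.0], [1.0], [2.0], [3.0]]
y = [-1, -1, +1, +1]
rules, alphas = adaboost_train(X, y, stump_learner, 3)
f = lambda x: 1 if sum(a * h(x) for h, a in zip(rules, alphas)) >= 0 else -1
print([f(x) for x in X])  # prints [-1, -1, 1, 1]
```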
adaboost_exp1    Example on AdaBoost.
weak_learner2d   Example of a weak learner for AdaBoost.
crossval         Partitions data for cross-validation.
adaboost         AdaBoost training algorithm.
adaclass         AdaBoost classifier.
Vojtech Franc
2004-08-31