SMO Sequential Minimal Optimization for binary SVM with L1-soft margin.
Synopsis:
model = smo( data )
model = smo( data, options )
model = smo( data, options, init_model)
Description:
This function is an implementation of the Sequential Minimal
Optimization (SMO) algorithm [Platt98] for training the binary
Support Vector Machines classifier with L1-soft margin.
Input:
data [struct] Binary labeled training vectors:
.X [dim x num_data] Training vectors.
.y [1 x num_data] Labels (1 or 2).
options [struct] Control parameters:
.ker [string] Kernel identifier (default 'linear');
See 'help kernel' for more info.
.arg [1 x nargs] Kernel argument(s) (default 1).
.C Regularization constant (default C=inf). The constant C can
be given as:
C [1x1] .. for all data.
C [1x2] .. for each class separately C=[C1,C2].
C [1xnum_data] .. for each training vector separately.
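The three forms of the regularization constant C can be set up as
sketched below (the kernel and the values of C are illustrative only;
'data' is assumed to be a labeled training set as described above):

```matlab
% One C shared by all training vectors:
options = struct('ker','rbf','arg',1,'C',10);

% A different C for each class, C = [C1,C2]:
options = struct('ker','rbf','arg',1,'C',[10 1]);

% An individual C for each training vector
% (num_data is the number of columns of data.X):
options = struct('ker','rbf','arg',1);
options.C = 10*ones(1, size(data.X,2));

model = smo(data, options);
```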
.eps [1x1] SMO parameter (default 0.001).
.tol [1x1] Tolerance of KKT-conditions (default 0.001).
init_model [struct] Specifies initial model:
.Alpha [num_data x 1] Initial Lagrange multipliers.
.b [1x1] Initial bias.
If not given then Alpha and b are set to zero.
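An explicit initial model can be passed as a third argument, e.g. to
make the default zero initialization explicit; note that Alpha must
have one entry per training vector (a sketch, with 'data' and
'options' assumed to be set up as above):

```matlab
% Start SMO from an explicit initial solution; all-zero Alpha and b
% equal the default starting point.
num_data = size(data.X, 2);
init_model = struct('Alpha', zeros(num_data,1), 'b', 0);
model = smo(data, options, init_model);
```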
Output:
model [struct] Binary SVM classifier:
.Alpha [nsv x 1] Weights (Lagrangians).
.b [1x1] Bias.
.sv.X [dim x nsv] Support vectors.
.nsv [1x1] Number of Support Vectors.
.kercnt [1x1] Number of kernel evaluations used by SMO.
.trnerr [1x1] Training classification error.
.margin [1x1] Margin of the found classifier.
.cputime [1x1] Used CPU time in seconds.
.options [struct] Copy of used options.
Example:
trn = load('riply_trn');
model = smo(trn,struct('ker','rbf','C',10,'arg',1));
figure; ppatterns(trn); psvm(model);
tst = load('riply_tst');
ypred = svmclass( tst.X, model );
cerror( ypred, tst.y )
See also
SVMCLASS, SVMLIGHT, SVMQUADPROG.
About: Statistical Pattern Recognition Toolbox
(C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac
Czech Technical University Prague
Faculty of Electrical Engineering
Center for Machine Perception
Modifications:
23-may-2004, VF
17-September-2001, V. Franc, created