MPERCEPTRON
Perceptron algorithm to train a linear machine.
Synopsis:
model = mperceptron(data)
model = mperceptron(data,options)
model = mperceptron(data,options,init_model)
Description:
model = mperceptron(data) uses the Perceptron learning rule
to train a linear machine (multi-class linear classifier).
The multi-class problem is transformed to a two-class
one using Kessler's construction [DHS01][SH10].
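The learning rule can be sketched as follows. This is a minimal illustration, not the actual STPRtool implementation; the function name mperceptron_sketch is hypothetical, and Kessler's construction is applied implicitly by updating the hyperplanes of both the true class and the wrongly winning class on each mistake:

```matlab
% Hypothetical sketch of the multi-class Perceptron (not the STPRtool code).
% X [dim x num_data] training vectors, y [1 x num_data] labels 1..nclass.
function model = mperceptron_sketch(X, y, tmax)
  [dim, num_data] = size(X);
  nclass = max(y);
  W = zeros(dim, nclass);              % normal vectors
  b = zeros(nclass, 1);                % biases
  for t = 1:tmax
    converged = true;
    for i = 1:num_data
      [~, yp] = max(W' * X(:,i) + b);  % predicted class
      if yp ~= y(i)
        % Perceptron update: reward the true class, penalize the winner.
        W(:, y(i)) = W(:, y(i)) + X(:,i);  b(y(i)) = b(y(i)) + 1;
        W(:, yp)   = W(:, yp)   - X(:,i);  b(yp)   = b(yp)   - 1;
        converged = false;
      end
    end
    if converged, break; end
  end
  model.W = W;  model.b = b;  model.t = t;
  model.exitflag = double(converged);
end
```

If the data are linearly separable the loop terminates with exitflag = 1; otherwise it stops after tmax passes with exitflag = 0, matching the output fields described below.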
model = mperceptron(data,options) specifies stopping condition of
the algorithm in structure options:
.tmax [1x1]... maximal number of iterations.
model = mperceptron(data,options,init_model) specifies initial
model which must contain:
.W [dim x nclass] ... Normal vectors.
.b [nclass x 1] ... Biases.
Input:
data [struct] Labeled training data:
.X [dim x num_data] Training vectors.
.y [1 x num_data] Labels (1,2,...,nclass).
options [struct]
.tmax [1x1] Maximal number of iterations (default tmax=inf).
init_model [struct] Initial model; must contain items .W, .b.
Output:
model [struct] Multi-class linear classifier:
.W [dim x nclass] Normal vectors.
.b [nclass x 1] Biases.
.exitflag [1x1] 1 ... perceptron has converged.
0 ... number of iterations exceeded tmax.
.t [1x1] Number of iterations.
Example:
data = load('pentagon');
model = mperceptron( data );
figure; ppatterns( data ); pboundary( model );
See also:
PERCEPTRON, LINCLASS, EKOZINEC.
Modifications:
21-may-2004, VF
18-may-2004, VF