Examples: Statistical Pattern Recognition Toolbox
This demo shows algorithms that learn a separating hyperplane for linearly separable binary data, e.g., the Perceptron, Kozinec's algorithm, and the linear SVM. The demo allows the user to create simple examples interactively and to compare the algorithms.
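For illustration, here is a minimal sketch of the Perceptron rule in plain Python/NumPy; the function name and variables are ours, not the toolbox's API:

```python
import numpy as np

def perceptron(X, y, max_epochs=1000):
    """Find a separating hyperplane w.x + b = 0 for labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified: shift hyperplane
                w += yi * xi
                b += yi
                errors += 1
        if errors == 0:                  # converged: data separated
            return w, b
    raise RuntimeError("data may not be linearly separable")
```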
The Generalized Anderson's task belongs to a class of non-Bayesian approaches to classification. The class-conditional probabilities are assumed to be influenced by a non-random intervention, and the minimax approach is used to design a classifier prepared for the worst possible intervention. The demo allows the user to create simple examples interactively and to compare different algorithms for solving the task.
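As a hedged sketch of the underlying objective, in the standard minimax notation rather than anything taken from this page: the task seeks the hyperplane maximizing the worst-case Mahalanobis margin over the admissible Gaussian parameters.

```latex
% Generalized Anderson's task in minimax form (notation assumed):
% (\mu_i, \Sigma_i) range over the admissible Gaussian parameters,
% y_i \in \{+1, -1\} are their class labels.
(w^*, b^*) \;=\; \arg\max_{w,\,b}\; \min_{i}\;
  \frac{y_i\,\bigl(w^\top \mu_i + b\bigr)}{\sqrt{w^\top \Sigma_i\, w}}
```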
The demo allows the user to interactively define toy training sets and to train SVM classifiers with different kernels and regularization constants.
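A minimal stand-in using scikit-learn rather than the toolbox itself; the dataset and parameter grid are illustrative:

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)  # toy training set
for kernel in ("linear", "poly", "rbf"):
    for C in (0.1, 1.0, 10.0):            # regularization constant
        clf = SVC(kernel=kernel, C=C).fit(X, y)
        print(kernel, C, clf.score(X, y), len(clf.support_))
```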
The demo shows the EM algorithm used to estimate the parameters of a Gaussian mixture model.
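A compact EM sketch for a Gaussian mixture in plain NumPy/SciPy; variable names are ours, and covariances are left unregularized for brevity:

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]           # init means from data
    sigma = np.array([np.cov(X.T) for _ in range(k)])
    pi = np.full(k, 1.0 / k)                          # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        r = np.column_stack([pi[j] * multivariate_normal.pdf(X, mu[j], sigma[j])
                             for j in range(k)])
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, covariances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            sigma[j] = (r[:, j, None] * diff).T @ diff / nk[j]
    return pi, mu, sigma
```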
The demo shows the minimax algorithm used to estimate the parameters of a multivariate Gaussian distribution.
The example shows the Perceptron rule applied to train a multi-class linear classifier via Kesler's construction.
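A hedged sketch of the idea: through Kesler's construction the multi-class problem reduces to a Perceptron-style update that rewards the true class and penalizes the strongest competing class (illustrative code, not the toolbox's API):

```python
import numpy as np

def kesler_perceptron(X, y, n_classes, max_epochs=1000):
    n, d = X.shape
    W = np.zeros((n_classes, d))          # one weight vector per class
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            scores = W @ xi
            true_score = scores[yi]
            scores[yi] = -np.inf
            c = int(scores.argmax())      # strongest competing class
            if true_score <= scores[c]:   # misclassified (or tied): update
                W[yi] += xi
                W[c] -= xi
                errors += 1
        if errors == 0:
            return W
    raise RuntimeError("did not converge")
```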
The figure shows Principal Component Analysis used to find the 1D representation of 2D input data with minimal reconstruction error.
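A minimal PCA sketch: the 1D subspace minimizing reconstruction error is spanned by the leading right singular vector of the centered data.

```python
import numpy as np

def pca_1d(X):
    Xc = X - X.mean(axis=0)                  # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    w = Vt[0]                                # leading principal direction
    Z = Xc @ w                               # 1D representation
    X_rec = np.outer(Z, w) + X.mean(axis=0)  # minimal-error reconstruction
    return Z, X_rec
```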
The example shows the difference between Linear Discriminant Analysis and Principal Component Analysis when used for feature extraction.
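A short stand-in using scikit-learn to contrast the two extractors on toy data; PCA ignores the labels, LDA uses them:

```python
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = make_blobs(n_samples=200, centers=2, random_state=0)
z_pca = PCA(n_components=1).fit_transform(X)      # direction of max variance
z_lda = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)  # max class separation
```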
The example shows the greedy kernel PCA algorithm used to model the training data.
The figure shows a quadratic classifier found by the Perceptron algorithm on data mapped to a feature space by a quadratic mapping.
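A sketch of the quadratic mapping assumed here for 2D inputs; a linear Perceptron boundary in the lifted space is quadratic in the original space:

```python
import numpy as np

def quad_map(X):
    """Lift 2D points to the quadratic feature space."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1 * x1, x2 * x2, x1 * x2])

# Z = quad_map(X); then run the Perceptron rule (see the sketch above) on (Z, y).
```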
The example shows fitting of the a posteriori probability to the SVM output. The sigmoid function is fitted by maximum-likelihood estimation, and a Gaussian model is used for comparison.
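A hedged sketch of the maximum-likelihood sigmoid fit, Platt-style scaling without Platt's target smoothing; names and parametrization are ours:

```python
import numpy as np
from scipy.optimize import minimize

def fit_sigmoid(f, y):
    """Fit p(y=1|f) = 1 / (1 + exp(a*f + b)) to SVM outputs f, labels y in {0, 1}."""
    def neg_log_lik(params):
        a, b = params
        p = 1.0 / (1.0 + np.exp(a * f + b))
        eps = 1e-12                       # guard the logs
        return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    res = minimize(neg_log_lik, x0=[-1.0, 0.0])
    return res.x                          # fitted (a, b)
```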
The figure shows data clustering found by the K-means algorithm.
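A compact K-means sketch (Lloyd's iterations); empty clusters keep their previous centers:

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)             # assign to nearest center
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):         # assignments stabilized
            break
        centers = new
    return labels, centers
```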
The figure shows the multi-class BSVM classifier with the L2-soft margin.
The figure shows a binary classifier trained by the Kernel Fisher Discriminant.
The figure shows the decision boundary of the SVM classifier and its approximation computed by the reduced set method. The original decision rule involves 94 support vectors, while the reduced one uses only 10.
The figure shows the decision boundary of the Bayesian classifier (solid line) and the decision boundary of the reject-option rule (dashed line). The class-conditional distributions are modeled by Gaussian mixture models estimated by the EM algorithm.
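A minimal sketch of one common reject-option rule on top of estimated posteriors; the threshold 1 - eps is an assumed parametrization, not necessarily the demo's:

```python
import numpy as np

def classify_with_reject(posteriors, eps=0.1):
    """posteriors: (n, K) array of P(class | x); returns labels, -1 = reject."""
    best = posteriors.argmax(axis=1)
    conf = posteriors.max(axis=1)
    best[conf < 1.0 - eps] = -1     # too uncertain: refuse to decide
    return best
```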
The figure shows the decision boundary of the (K=8)-nearest neighbors classifier.
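A minimal stand-in with scikit-learn on toy data:

```python
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier

X, y = make_blobs(n_samples=300, centers=3, random_state=0)
knn = KNeighborsClassifier(n_neighbors=8).fit(X, y)   # K = 8 neighbors
print(knn.predict(X[:5]))
```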
The toolbox provides the means to design an OCR system:
The figures show the OCR for hand-written numerals based on the multi-class SVM. The toolbox provides a simple GUI which allows the user to draw numerals with a standard mouse.
The figure shows the idea of using kernel PCA to model data for image denoising.
The figures show the application of kernel PCA to denoising of USPS hand-written numerals corrupted by Gaussian noise; a minimal code sketch follows the figure.
Figure rows: Ground truth, Noisy images, Linear PCA, Kernel PCA.
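A minimal sketch of this denoising pipeline using scikit-learn's KernelPCA; its built-in pre-image map (fit_inverse_transform=True) stands in for the demo's own pre-image computation, and the dataset and parameters are illustrative:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA

X = load_digits().data / 16.0                        # stand-in for USPS digits
rng = np.random.default_rng(0)
X_noisy = X + rng.normal(scale=0.3, size=X.shape)    # add Gaussian noise

kpca = KernelPCA(n_components=32, kernel="rbf", gamma=1e-2,
                 fit_inverse_transform=True)         # learns a pre-image map
kpca.fit(X)                                          # model the clean data
X_denoised = kpca.inverse_transform(kpca.transform(X_noisy))
```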