DEMO_EMGMM
Demo on Expectation-Maximization (EM) algorithm.
Synopsis:
demo_emgmm
Description:
This demo shows the Expectation-Maximization (EM) algorithm
[Schles68][DLR77] for the Gaussian mixture model (GMM). The EM
algorithm fits the GMM to i.i.d. sample data (here 2D only)
so that the likelihood is maximized.
The estimated model is displayed as ellipses (shapes of the
covariance matrices) and crosses (mean vectors). The value
of the optimized log-likelihood function for the current estimate
is displayed in the bottom part.
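The iteration the demo animates alternates an E-step (computing the posterior probability of each component given each sample) with an M-step (re-estimating weights, means and covariances from those posteriors). A minimal MATLAB sketch of these updates follows; it is an illustration only, not the toolbox's EMGMM implementation, and the function and variable names are chosen here for clarity:

```matlab
function model = em_sketch(X, ncomp, niter)
% EM_SKETCH Illustrative EM updates for a GMM (not the toolbox's EMGMM).
%   X      [dim x n] data, ncomp components, niter iterations.
  [dim, n] = size(X);
  % Init: first ncomp samples as means, shared sample covariance,
  % uniform mixing weights (one of the init choices the demo offers).
  Mean  = X(:, 1:ncomp);
  Cov   = repmat(cov(X'), [1 1 ncomp]);
  Prior = ones(1, ncomp) / ncomp;
  for t = 1:niter
    % E-step: unnormalized posteriors p(k)*N(x | mean_k, cov_k).
    P = zeros(ncomp, n);
    for k = 1:ncomp
      D = X - repmat(Mean(:,k), 1, n);
      P(k,:) = Prior(k) * exp(-0.5 * sum(D .* (Cov(:,:,k) \ D), 1)) ...
               / sqrt((2*pi)^dim * det(Cov(:,:,k)));
    end
    P = P ./ repmat(sum(P, 1), ncomp, 1);   % normalize over components
    % M-step: weighted re-estimation of each component.
    for k = 1:ncomp
      w  = P(k,:);  sw = sum(w);
      Prior(k)   = sw / n;
      Mean(:,k)  = X * w' / sw;
      D          = X - repmat(Mean(:,k), 1, n);
      Cov(:,:,k) = (D .* repmat(w, dim, 1)) * D' / sw;
    end
  end
  model = struct('Mean', Mean, 'Cov', Cov, 'Prior', Prior);
```

Each pass through the loop cannot decrease the log-likelihood, which is why the demo's log-likelihood graph is monotonically non-decreasing across steps.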
Control:
Covariance - Determines type of the covariance matrix:
Diagonal (independent features),
Full (correlated features).
Components - Number of components (Gaussians) in the mixture.
Iterations - Number of iterations in one step.
Random init - If checked, the initial model is generated
              randomly; otherwise the first n training samples
              are taken as the mean vectors.
FIG2EPS - Export the screen to a PostScript file.
Save model - Save current model to file.
Load data - Load input point sets from file.
Create data - Invoke program for creating point sets.
Reset - Set the tested algorithm to the initial state.
Play - Run the tested algorithm.
Stop - Stop the running algorithm.
Step - Perform only one step.
Info - Info box.
Close - Close the program.
See also EMGMM.
About: Statistical Pattern Recognition Toolbox
(C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac
Czech Technical University Prague
Faculty of Electrical Engineering
Center for Machine Perception
Modifications:
19-sep-2003, VF
11-jun-2001, V.Franc, comments added.
27-feb-2000, V.Franc
5-apr-2000, V.Franc
23-jun-2000, V.Hlavac, comments polished; message when no data
             loaded; export of the solution to global variables.
27-mar-2001, V.Franc, graph of log-likelihood function added.