Semi-Supervised Learning by Gaussian Mixtures
Choi, Byoung-Jeong; Chae, Youn-Seok; Choi, Woo-Young; Park, Chang-Yi; Koo, Ja-Yong
Discriminant analysis based on Gaussian mixture models, a useful tool for multi-class classification, can be extended to semi-supervised learning. We consider a model selection problem for a Gaussian mixture model in semi-supervised learning. More specifically, we adopt the Bayesian information criterion (BIC) to determine the number of subclasses in the mixture model. Through simulations, we illustrate the usefulness of the criterion.
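The model-selection idea described in the abstract can be sketched as follows: fit Gaussian mixtures with an increasing number of components and keep the one that minimizes BIC. This is a minimal illustration using scikit-learn's `GaussianMixture` on synthetic data, not the authors' own implementation or simulation setup.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two well-separated Gaussians.
X = np.concatenate([rng.normal(-3, 1, 200),
                    rng.normal(3, 1, 200)]).reshape(-1, 1)

# Fit mixtures with k = 1..5 components and record BIC for each;
# BIC = -2 log-likelihood + (number of parameters) * log(n).
bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bics[k] = gm.bic(X)

# The selected number of subclasses is the BIC minimizer.
best_k = min(bics, key=bics.get)
print(best_k)
```

Here BIC recovers the true number of components (two); in the semi-supervised setting of the paper, the same criterion is applied per class to choose the number of subclasses.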
Keywords: BIC; classification; density estimation; EM algorithm; Gaussian mixture
References
Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984). Classification and Regression Trees, Wadsworth, Belmont.

Dempster, A. P., Laird, N. M. and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B, 39, 1-38.

Halbe, Z. and Aladjem, M. (2005). Model-based mixture discriminant analysis: An experimental study, Pattern Recognition, 38, 437-440.

Hastie, T. and Tibshirani, R. (1996). Discriminant analysis by Gaussian mixtures, Journal of the Royal Statistical Society, Series B, 58, 158-176.

Nigam, K., McCallum, A. K., Thrun, S. and Mitchell, T. (2000). Text classification from labeled and unlabeled documents using EM, Machine Learning, 39, 103-134.

Zhu, X. (2005). Semi-Supervised Learning Literature Survey, Technical Report 1530, Computer Sciences, University of Wisconsin-Madison.