 Title & Authors
Semi-Supervised Learning by Gaussian Mixtures
Choi, Byoung-Jeong; Chae, Youn-Seok; Choi, Woo-Young; Park, Chang-Yi; Koo, Ja-Yong
 Abstract
Discriminant analysis based on Gaussian mixture models, a useful tool for multi-class classification, can be extended to semi-supervised learning. We consider a model selection problem for a Gaussian mixture model in semi-supervised learning. More specifically, we adopt the Bayesian information criterion to determine the number of subclasses in the mixture model. Through simulations, we illustrate the usefulness of the criterion.
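The following is a minimal illustrative sketch (not the authors' code) of the kind of model the abstract describes: a Gaussian mixture is fitted to each class by the EM algorithm, the Bayesian information criterion selects the number of subclasses per class, and new points are classified by the largest prior-weighted mixture density. It assumes scikit-learn's GaussianMixture and toy synthetic data; the semi-supervised step of the paper, which also feeds unlabeled observations into the EM fit, is not shown here.

import numpy as np
from sklearn.mixture import GaussianMixture

def fit_class_mixture(X_class, max_subclasses=5, seed=0):
    # Fit mixtures with 1..max_subclasses components; keep the lowest-BIC fit.
    best_gmm, best_bic = None, np.inf
    for k in range(1, max_subclasses + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full", random_state=seed)
        gmm.fit(X_class)                 # parameters estimated by EM
        bic = gmm.bic(X_class)           # BIC: -2 log-likelihood + parameter penalty
        if bic < best_bic:
            best_gmm, best_bic = gmm, bic
    return best_gmm

# Toy labeled data in 2-D (illustrative only): class 0 has two subclasses, class 1 has one.
rng = np.random.default_rng(0)
X0 = np.vstack([rng.normal([0, 0], 0.5, (100, 2)), rng.normal([3, 0], 0.5, (100, 2))])
X1 = rng.normal([1.5, 3.0], 0.7, (200, 2))

models = {0: fit_class_mixture(X0), 1: fit_class_mixture(X1)}
priors = {0: 0.5, 1: 0.5}                # equal class priors for the toy example

# Classify a new point by the largest log prior + log mixture density.
x_new = np.array([[2.8, 0.2]])
scores = {c: np.log(priors[c]) + m.score_samples(x_new)[0] for c, m in models.items()}
print("predicted class:", max(scores, key=scores.get))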
 Keywords
BIC; classification; density estimation; EM algorithm; Gaussian mixture
 Language
Korean
 References
1.
Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984). Classification and Regression Trees, Wadsworth, Belmont

2.
Dempster, A. P., Laird, N. M. and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B, 39, 1-38

3.
Halbe, Z. and Aladjem, M. (2005). Model-based mixture discriminant analysis: an experimental study, Pattern Recognition, 38, 437-440

4.
Hastie, T. and Tibshirani, R. (1996). Discriminant analysis by Gaussian mixtures, Journal of the Royal Statistical Society, Series B, 58, 158-176

5.
Nigam, K., McCallum, A. K., Thrun, S. and Mitchell, T. (2000). Text classification from labeled and unlabeled documents using EM, Machine Learning, 39, 103-134

6.
Zhu, X. (2005). Semi-Supervised Learning Literature Survey, Technical Report 1530, Computer Sciences, University of Wisconsin-Madison