Sparse Multinomial Kernel Logistic Regression

Shim, Joo-Yong; Bae, Jong-Sig; Hwang, Chang-Ha

  • Published: 2008.01.31

Abstract

Multinomial logistic regression is a well-known multiclass classification method in the field of statistical learning. More recently, sparse multinomial logistic regression models have found application in microarray classification, where explicit identification of the most informative observations is of value. In this paper, we propose a sparse multinomial kernel logistic regression model in which sparsity arises from the use of a Laplacian prior, and we derive a fast exact algorithm by employing a bound optimization approach. Experimental results are presented to illustrate the performance of the proposed procedure.
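
The abstract describes the model (a kernel multinomial logistic regression with a Laplacian, i.e. L1, prior on the expansion coefficients) but not the bound-optimization updates themselves. The sketch below is a minimal illustration of that model only: it fits the L1-penalized kernel multinomial logistic objective with plain proximal gradient descent rather than the authors' fast exact bound-optimization algorithm, so it should be read as a rough stand-in. The RBF bandwidth, penalty level, iteration count, and toy data are all placeholder assumptions.

# Minimal sketch (not the authors' algorithm): L1-penalized kernel
# multinomial logistic regression fit by proximal gradient (ISTA).
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and X2."""
    sq = (X1 ** 2).sum(1)[:, None] + (X2 ** 2).sum(1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * sq)

def fit_sparse_kmlr(K, y, n_classes, lam=0.5, n_iter=500):
    """Minimize  -log-likelihood(K A) + lam * ||A||_1  over coefficients A."""
    n = K.shape[0]
    Y = np.eye(n_classes)[y]            # one-hot labels, n x C
    A = np.zeros((n, n_classes))        # kernel-expansion coefficients
    # Step size from a Lipschitz bound: the softmax NLL has curvature <= 1/2
    # in the scores, so its gradient in A is Lipschitz with constant
    # 0.5 * ||K||_2^2 (spectral norm of the kernel matrix).
    eta = 1.0 / (0.5 * np.linalg.norm(K, 2) ** 2)
    for _ in range(n_iter):
        F = K @ A                       # class scores
        F -= F.max(axis=1, keepdims=True)
        P = np.exp(F)
        P /= P.sum(axis=1, keepdims=True)
        grad = K @ (P - Y)              # gradient of the NLL in A (K symmetric)
        A = A - eta * grad
        # Soft-thresholding = proximal step for the L1 (Laplacian-prior) term;
        # this is what drives many coefficient rows exactly to zero.
        A = np.sign(A) * np.maximum(np.abs(A) - eta * lam, 0.0)
    return A

def predict(K_test_train, A):
    return np.argmax(K_test_train @ A, axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy 3-class problem: three Gaussian blobs in the plane.
    X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in ([0, 0], [3, 0], [0, 3])])
    y = np.repeat([0, 1, 2], 30)
    K = rbf_kernel(X, X, gamma=0.5)
    A = fit_sparse_kmlr(K, y, n_classes=3)
    print("training accuracy:", np.mean(predict(K, A) == y))
    print("nonzero coefficient rows:", int(np.sum(np.any(A != 0.0, axis=1))), "of", len(y))

The rows of A that remain nonzero correspond to the informative training observations the abstract refers to; the paper's bound-optimization scheme reaches such a solution with a fixed curvature bound instead of the generic step size used here.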

Keywords

Bound optimization; Laplacian regularization; multinomial logistic regression; sparsity; support vector machine
