Multinomial Kernel Logistic Regression via Bound Optimization Approach
 Title & Authors
Multinomial Kernel Logistic Regression via Bound Optimization Approach
Shim, Joo-Yong; Hong, Dug-Hun; Kim, Dal-Ho; Hwang, Chang-Ha
Multinomial logistic regression is probably the most popular probabilistic discriminative classifier for multiclass classification problems. In this paper, a kernel variant of multinomial logistic regression is proposed by combining Newton's method with a bound optimization approach. This formulation allows us to apply highly efficient approximation methods that effectively overcome the conceptual and numerical problems of standard multiclass kernel classifiers. We also provide an approximate cross validation (ACV) method for choosing the hyperparameters that affect the performance of the proposed approach. Experimental results are then presented to illustrate the performance of the proposed procedure.
Approximate cross validation; hyperparameters; multinomial logistic regression; support vector machine
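The bound-optimization idea referenced in the abstract (due to Bohning, 1992, cited below) replaces the multinomial logistic Hessian with the fixed upper bound B = (1/2)(I - 11'/K) ⊗ X'X, so every Newton-like step solves the same linear system while still decreasing the negative log-likelihood monotonically. The following is a minimal sketch of that algorithm for the linear (non-kernel) case on synthetic data; the toy data, variable names, and the small ridge term are illustrative assumptions, not details from the paper.

```python
# Sketch: multinomial logistic regression fit by bound optimization
# (lower-bound Newton steps with Bohning's fixed Hessian bound).
# Toy data and names are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class problem: Gaussian clusters in 2-D, plus an intercept column.
K, n_per, d = 3, 60, 3
means = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0]])
X = np.vstack([rng.normal(m, 1.0, size=(n_per, 2)) for m in means])
X = np.hstack([np.ones((X.shape[0], 1)), X])       # intercept
y = np.repeat(np.arange(K), n_per)
Y = np.eye(K)[y][:, :K - 1]                        # one-hot, class K as reference

def probs(W):
    """Class probabilities with class K as the zero-parameter reference."""
    Z = np.hstack([X @ W, np.zeros((X.shape[0], 1))])
    Z -= Z.max(axis=1, keepdims=True)              # numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def nll(W):
    P = probs(W)
    return -np.log(P[np.arange(len(y)), y]).sum()

# Bohning's bound B = (1/2)(I - 11'/K) kron X'X dominates the Hessian
# everywhere, so B is factored once.  By Kronecker identities the update
# is W += (X'X)^{-1} G A^{-1}, with G = X'(Y - P) the log-likelihood
# gradient and A^{-1} = 2(I + 11') by Sherman-Morrison.
XtX_inv = np.linalg.inv(X.T @ X + 1e-8 * np.eye(d))   # tiny ridge (assumption)
A_inv = 2.0 * (np.eye(K - 1) + np.ones((K - 1, K - 1)))

W = np.zeros((d, K - 1))
losses = [nll(W)]
for _ in range(50):
    G = X.T @ (Y - probs(W)[:, :K - 1])
    W += XtX_inv @ G @ A_inv
    losses.append(nll(W))                          # decreases monotonically

acc = (probs(W).argmax(axis=1) == y).mean()
print(f"final NLL {losses[-1]:.2f}, accuracy {acc:.2f}")
```

The paper's kernel variant would replace the linear scores X @ W with a kernel expansion and add a regularization term, but the same fixed-bound update structure applies.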
 Cited by
Hwang, Chang-Ha and Shim, Joo-Yong (2008). Semiparametric Kernel Poisson Regression for Longitudinal Count Data. Communications for Statistical Applications and Methods, 15, 1003-1011
Shim, Joo-Yong, Bae, Jong-Sig and Hwang, Chang-Ha (2008). Multiclass Classification via Least Squares Support Vector Machine Regression. Communications for Statistical Applications and Methods, 15, 441-450
Blake, C. L. and Merz, C. J. (1998). UCI Repository of Machine Learning Databases. University of California, Irvine, Department of Information and Computer Science. Available from: http://www.ics.uci.edu/~mlearn/MLRepository.html

Bohning, D. (1992). Multinomial logistic regression algorithm. Annals of the Institute of Statistical Mathematics, 44, 197-200

Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31, 377-403

Kimeldorf, G. S. and Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and Applications, 33, 82-95

Krishnapuram, B., Carin, L., Figueiredo, M. A. T. and Hartemink, A. J. (2005). Sparse multinomial logistic regression: fast algorithms and generalization bounds. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27, 957-968

Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London, 209, 415-446

Minka, T. (2003). A comparison of numerical optimizers for logistic regression. Technical Report, Department of Statistics, Carnegie Mellon University

Rifkin, R. and Klautau, A. (2004). In defense of one-vs-all classification. Journal of Machine Learning Research, 5, 101-141

Suykens, J. A. K. and Vandewalle, J. (1999). Multiclass least squares support vector machines. Proceedings of the International Joint Conference on Neural Networks, 900-903

Vapnik, V. N. (1995). The Nature of Statistical Learning Theory. Springer-Verlag, New York

Vapnik, V. N. (1998). Statistical Learning Theory. Springer-Verlag, New York

Wahba, G., Lin, Y. and Zhang, H. (1999). Generalized approximate cross validation for support vector machines, or, another way to look at margin-like quantities. Technical Report No. 1006, University of Wisconsin

Weston, J. and Watkins, C. (1998). Multi-class SVM. Technical Report 98-04, Royal Holloway University of London