 Title & Authors
Multinomial Kernel Logistic Regression via Bound Optimization Approach
Shim, Joo-Yong; Hong, Dug-Hun; Kim, Dal-Ho; Hwang, Chang-Ha
 Abstract
Multinomial logistic regression is probably the most popular representative of probabilistic discriminative classifiers for multiclass classification problems. In this paper, a kernel variant of multinomial logistic regression is proposed by combining Newton's method with a bound optimization approach. This formulation allows us to apply highly efficient approximation methods that effectively overcome the conceptual and numerical problems of standard multiclass kernel classifiers. We also provide an approximate cross validation (ACV) method for choosing the hyperparameters that affect the performance of the proposed approach. Experimental results are then presented to illustrate the performance of the proposed procedure.
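The bound optimization idea goes back to Böhning (1992), reference 2 below: the varying Hessian of the multinomial log-likelihood is replaced by a fixed upper bound, so the Newton-style system matrix can be inverted once and reused at every iteration. The Python sketch below illustrates this scheme for a kernelized model with an RBF Gram matrix and a ridge-type penalty. It is a minimal illustration of the general technique under our own assumptions, not the authors' exact formulation; the names (fit_mklr, rbf_kernel, predict_proba), the toy data, and the defaults for gamma and lam are all illustrative.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_mklr(X, y, n_classes, gamma=1.0, lam=0.1, n_iter=100):
    """Kernel multinomial logistic regression fitted by bound optimization.

    y holds integer labels 0..n_classes-1; class n_classes-1 is the
    baseline (its logit is fixed at 0).  Returns the n x (n_classes-1)
    coefficient matrix A, so a new point x gets the logits A.T @ k(x, X).
    """
    n, Km1 = len(y), n_classes - 1
    K = rbf_kernel(X, X, gamma)                       # n x n Gram matrix
    Y = np.eye(n_classes)[y][:, :Km1]                 # one-hot, baseline dropped
    # Bohning's fixed upper bound on each per-sample logit Hessian:
    #   diag(p) - p p^T  <=  B = (1/2) (I - 11^T / n_classes).
    B = 0.5 * (np.eye(Km1) - np.ones((Km1, Km1)) / n_classes)
    # Fixed surrogate Hessian for the column-stacked vec(A), including the
    # (lam/2) * sum_c a_c^T K a_c penalty; inverted once, outside the loop,
    # which is the whole point of the bound-optimization (MM) scheme.
    M = (np.kron(B, K @ K) + lam * np.kron(np.eye(Km1), K)
         + 1e-8 * np.eye(Km1 * n))
    M_inv = np.linalg.inv(M)
    A = np.zeros((n, Km1))
    for _ in range(n_iter):
        E = np.exp(K @ A)                             # unnormalized class scores
        P = E / (1.0 + E.sum(axis=1, keepdims=True))  # baseline prob is implicit
        G = K @ (P - Y) + lam * (K @ A)               # penalized gradient
        A -= (M_inv @ G.T.ravel()).reshape(Km1, n).T  # fixed-matrix Newton step
    return A

def predict_proba(A, X_train, X_new, gamma=1.0):
    """Class probabilities for new points; last column is the baseline class."""
    E = np.exp(rbf_kernel(X_new, X_train, gamma) @ A)
    denom = 1.0 + E.sum(axis=1, keepdims=True)
    return np.hstack([E / denom, 1.0 / denom])

# Toy usage: three Gaussian blobs in the plane (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(30, 2)) for m in ([0, 0], [3, 0], [0, 3])])
y = np.repeat([0, 1, 2], 30)
A = fit_mklr(X, y, n_classes=3, gamma=0.5, lam=0.1)
print((predict_proba(A, X, X, gamma=0.5).argmax(axis=1) == y).mean())
```

Because the bound matrix dominates the true Hessian everywhere, each fixed-matrix step is guaranteed to decrease the penalized negative log-likelihood, and each iteration costs only matrix-vector work rather than re-forming and re-factoring a Hessian. In the paper the hyperparameters (the kernel width and the penalty) are chosen by approximate cross validation; in a sketch like this one would simply grid-search gamma and lam over held-out folds.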
 Keywords
Approximate cross validation; hyperparameters; multinomial logistic regression; support vector machine
 Language
English
 Cited by
1.
Semiparametric Kernel Poisson Regression for Longitudinal Count Data. Communications for Statistical Applications and Methods, 2008, vol. 15, no. 6, pp. 1003-1011

2.
Multiclass Classification via Least Squares Support Vector Machine Regression. Communications for Statistical Applications and Methods, 2008, vol. 15, no. 3, pp. 441-450
 References
1.
Blake, C. L. and Merz, C. J. (1998). UCI Repository of machine learning databases. University of California, Department of Information and Computer Science. Available from: http://www.ics.uci.edu/~mlearn/MLRepository.html

2.
Böhning, D. (1992). Multinomial logistic regression algorithm. Annals of the Institute of Statistical Mathematics, 44, 197-200

3.
Craven, P. and Wahba, G. (1979). Smoothing noisy data with spline functions: estimating the correct degree of smoothing by the method of generalized cross-validation. Numerische Mathematik, 31, 377-403

4.
Kimeldorf, G. S. and Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and Applications, 33, 82-95

5.
Krishnapuram, B., Carin, L., Figueiredo, M. A. T. and Hartemink, A. J. (2005). Sparse multinomial logistic regression: fast algorithms and generalization bounds. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27, 957-968

6.
Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London, 209, 415-446

7.
Minka, T. (2003). A comparison of numerical optimizers for logistic regression. Technical Report, Department of Statistics, Carnegie Mellon University

8.
Rifkin, R. and Klautau, A. (2004). In defense of one-vs-all classification. Journal of Machine Learning Research, 5, 101-141

9.
Suykens, J. A. K. and Vandewalle, J. (1999). Multiclass least squares support vector machines. Proceedings of the International Joint Conference on Neural Networks, 900-903

10.
Vapnik, V. N. (1995). The Nature of Statistical Learning Theory. Springer-Verlag, New York

11.
Vapnik, V. N. (1998). Statistical Learning Theory. Springer-Verlag, New York

12.
Wahba, G., Lin, Y. and Zhang, H. (1999). Generalized approximate cross validation for support vector machines, or, another way to look at margin-like quantities. Technical Report No. 1006, University of Wisconsin

13.
Weston, J. and Watkins, C. (1998). Multi-class SVM. Technical Report 98-04, Royal Holloway, University of London