LS-SVM for large data sets
Park, Hongrak; Hwang, Hyungtae; Kim, Byungju;
In this paper we propose a multiclassification method for large data sets that ensembles least squares support vector machines (LS-SVMs) trained on principal components rather than on the raw input vectors. For multiclassification we use a revised one-vs-all method, a voting scheme that combines several binary classifiers. The revised one-vs-all method is carried out with the hat matrix of the LS-SVM ensemble, which is obtained by ensembling LS-SVMs trained on random samples drawn from the whole large training data set. The leave-one-out cross validation (CV) function is used to select the optimal values of the hyper-parameters that affect the performance of the multiclass LS-SVM ensemble, and we present a generalized cross validation (GCV) function that reduces the computational burden of the leave-one-out CV function. Experimental results on real data sets illustrate the performance of the proposed multiclass LS-SVM ensemble.
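To make the idea concrete, the following is a minimal sketch of the approach described above: each ensemble member is an LS-SVM trained on a random subsample projected onto its leading principal components, binary one-vs-all classifiers are built per class, and the class with the largest summed decision value is predicted. All function names, hyper-parameter defaults, and the RBF kernel choice here are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LS-SVM dual linear system
       [0   1^T        ] [b    ]   [0]
       [1   K + I/gamma] [alpha] = [y]
    and return (b, alpha)."""
    n = len(y)
    A = np.empty((n + 1, n + 1))
    A[0, 0] = 0.0
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_decision(Xtr, b, alpha, Xnew, sigma=1.0):
    """Decision value f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(Xnew, Xtr, sigma) @ alpha + b

def pca_fit(X, k):
    """Mean and top-k principal directions of X (via SVD)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def ensemble_ova_predict(X, y, Xnew, n_members=5, sample_frac=0.5,
                         n_pc=2, gamma=10.0, sigma=1.0, seed=0):
    """One-vs-all multiclass prediction from an ensemble of LS-SVMs,
    each trained on a random subsample projected onto principal components."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    scores = np.zeros((len(Xnew), len(classes)))
    for _ in range(n_members):
        idx = rng.choice(len(X), size=int(sample_frac * len(X)), replace=False)
        mu, V = pca_fit(X[idx], n_pc)            # PCs from this subsample
        Z, Znew = (X[idx] - mu) @ V.T, (Xnew - mu) @ V.T
        for j, c in enumerate(classes):          # one binary LS-SVM per class
            t = np.where(y[idx] == c, 1.0, -1.0)
            b, alpha = lssvm_fit(Z, t, gamma, sigma)
            scores[:, j] += lssvm_decision(Z, b, alpha, Znew, sigma)
    return classes[np.argmax(scores, axis=1)]    # vote via summed scores
```

Because each member solves only a small `(n+1)×(n+1)` linear system on its subsample, the ensemble sidesteps the cubic cost of training a single LS-SVM on the full data set, which is the motivation for subsampling in the first place.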
Keywords: Ensemble; generalized cross validation function; least squares support vector machine; multiclassification; one-vs-all method; principal components; random sample
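The GCV criterion mentioned in the abstract can be sketched as well. For an LS-SVM the fitted values are linear in the targets, `y_hat = H y`, so a GCV score of the Golub-Wahba form `GCV = n ||(I - H) y||^2 / tr(I - H)^2` can replace the n separate refits of leave-one-out CV. The construction of `H` from the bordered LS-SVM system below is an illustrative assumption, not the paper's exact derivation.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_hat_matrix(X, gamma, sigma=1.0):
    """Hat matrix H with fitted values y_hat = H y.
    From [0 1^T; 1 K + I/gamma][b; alpha] = [0; y] we get
    [b; alpha] = A^{-1}[:, 1:] y, and y_hat = 1 b + K alpha."""
    n = len(X)
    K = rbf_kernel(X, X, sigma)
    A = np.empty((n + 1, n + 1))
    A[0, 0] = 0.0
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    Ainv = np.linalg.inv(A)
    return np.hstack([np.ones((n, 1)), K]) @ Ainv[:, 1:]

def gcv(X, y, gamma, sigma=1.0):
    """GCV score: n * ||(I - H) y||^2 / tr(I - H)^2."""
    n = len(y)
    H = lssvm_hat_matrix(X, gamma, sigma)
    r = y - H @ y
    return n * (r @ r) / np.trace(np.eye(n) - H) ** 2
```

A hyper-parameter such as `gamma` would then be chosen by minimizing `gcv` over a grid, e.g. `min((gcv(X, y, g), g) for g in (0.1, 1.0, 10.0))[1]`, at the cost of one hat-matrix computation per candidate instead of n leave-one-out refits.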