 Title & Authors
Multiclass LS-SVM ensemble for large data
Hwang, Hyungtae
 Abstract
Multiclass classification is typically performed with a voting scheme that combines binary classifications. In this paper we propose a multiclass classification method for large data, which can be regarded as a revised one-vs-all method. Multiclass classification is performed using the hat matrix of a least squares support vector machine (LS-SVM) ensemble, obtained by aggregating individual LS-SVMs trained on subsets of the whole data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM. We then derive the generalized cross validation function to reduce the computational burden of the cross validation function. Experimental results are presented that illustrate the performance of the proposed method.
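The ingredients named in the abstract — an LS-SVM solved as a linear system, the hat matrix mapping targets to fitted values, the GCV score built from it, and a one-vs-all prediction aggregated over subset-trained models — can be illustrated with a rough sketch. This is not the authors' implementation: the function names, the RBF kernel choice, and the simple decision-averaging ensemble are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between the rows of A and B (assumed kernel)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_system(K, gamma):
    """LS-SVM dual system matrix  [[0, 1^T], [1, K + I/gamma]]."""
    n = K.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[0, 1:] = 1.0
    M[1:, 0] = 1.0
    M[1:, 1:] = K + np.eye(n) / gamma
    return M

def lssvm_fit(X, y, gamma, sigma):
    """Solve the linear system for the bias b and dual coefficients alpha."""
    K = rbf_kernel(X, X, sigma)
    sol = np.linalg.solve(lssvm_system(K, gamma), np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # b, alpha

def lssvm_decision(X_train, b, alpha, X_new, sigma):
    """Decision value f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def hat_matrix(X, gamma, sigma):
    """Hat matrix H with y_hat = H y: fitted values are [1 K] M^{-1} [0; y]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    E = np.vstack([np.zeros((1, n)), np.eye(n)])   # pads y with the 0 row
    return np.hstack([np.ones((n, 1)), K]) @ np.linalg.solve(lssvm_system(K, gamma), E)

def gcv(X, y, gamma, sigma):
    """GCV score  n * ||(I - H) y||^2 / tr(I - H)^2  for one hyperparameter pair."""
    n = len(y)
    R = np.eye(n) - hat_matrix(X, gamma, sigma)
    r = R @ y
    return n * (r @ r) / np.trace(R) ** 2

def ova_ensemble_predict(subsets, X_new, classes, gamma, sigma):
    """One-vs-all prediction, summing decision values over subset-trained LS-SVMs."""
    scores = np.zeros((len(X_new), len(classes)))
    for Xs, ys in subsets:                    # one LS-SVM per data subset
        for j, c in enumerate(classes):
            t = np.where(ys == c, 1.0, -1.0)  # +1 for class c, -1 otherwise
            b, a = lssvm_fit(Xs, t, gamma, sigma)
            scores[:, j] += lssvm_decision(Xs, b, a, X_new, sigma)
    return np.array(classes)[scores.argmax(axis=1)]
```

In this sketch the hyperparameters (gamma, sigma) would be chosen by minimizing `gcv` on a grid, which avoids the repeated refits of ordinary cross validation; the GCV score needs only one linear solve per hyperparameter pair.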
 Keywords
Ensemble; generalized cross validation function; hat matrix; least squares support vector machine; multiclass classification
 Language
English
 Cited by
1.
LS-SVM for large data sets, Journal of the Korean Data and Information Science Society, 2016, 27, 2, 549
 References
1.
Breiman, L. (1996). Bagging predictors. Machine Learning, 24, 123-140.

2.
Espinoza, M., Suykens, J. A. K. and De Moor, B. (2005). Load forecasting using least squares support vector machines. Lecture Notes in Computer Science, 3512, 1018-1026.

3.
Girolami, M. (2002). Orthogonal series density estimation and the kernel eigenvalue problem. Neural Computation, 14, 669-688.

4.
Kimeldorf, G. S. and Wahba, G. (1971). Some results on Tchebycheffian spline functions. Journal of Mathematical Analysis and its Applications, 33, 82-95.

5.
Mercer, J. (1909). Functions of positive and negative type, and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London A, 209, 415-446.

6.
Scholkopf, B., Burges, C. and Vapnik, V. (1995). Extracting support data for a given task. In Proceedings of the First International Conference on Knowledge Discovery and Data Mining, 252-257, Menlo Park, CA.

7.
Seok, K. H. (2014). Semi-supervised classification with LS-SVM formulation. Journal of the Korean Data & Information Science Society, 21, 461-470.

8.
Shim, J. and Hwang, C. (2013). Expected shortfall estimation using kernel machines. Journal of the Korean Data & Information Science Society, 24, 625-636.

9.
Suykens, J. A. K. and Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9, 293-300.

10.
Suykens, J. A. K. and Vandewalle, J. (1999). Multiclass least squares support vector machines. In Proceedings of the International Joint Conference on Neural Networks, 900-903, Washington, DC.

11.
Vapnik, V. N. (1995). The nature of statistical learning theory, Springer, New York.

12.
Vapnik, V. N. (1998). Statistical learning theory, Springer, New York.

13.
Weston, J. and Watkins, C. (1998). Multi-class support vector machines, Technical Report 98-04, Royal Holloway, University of London.

14.
Williams, C. K. I. and Seeger, M. (2001). Using the Nystrom method to speed up kernel machines. In Advances in Neural Information Processing Systems 13, 682-699, MIT Press.