Multiclass LS-SVM ensemble for large data

  • Received : 2015.08.28
  • Accepted : 2015.11.24
  • Published : 2015.11.30

Abstract

Multiclass classification is typically performed with a voting scheme that combines binary classifications. In this paper we propose a multiclass classification method for large data sets, which can be regarded as a revised one-vs-all method. Classification is carried out with the hat matrix of a least squares support vector machine (LS-SVM) ensemble, which is obtained by aggregating individual LS-SVMs trained on subsets of the whole data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM, and a generalized cross validation function is derived to reduce its computational burden. Experimental results are presented that demonstrate the performance of the proposed method.
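The sketch below is a minimal illustration of the kind of construction the abstract describes: a one-vs-all LS-SVM ensemble in which individual LS-SVMs are trained on subsets of a large data set and their real-valued outputs are aggregated before taking the largest class score. It assumes an RBF kernel and hypothetical parameter names `gamma` (regularization), `sigma` (kernel bandwidth) and `n_subsets`; the paper's hat-matrix-based aggregation and generalized cross validation for hyperparameter selection are not reproduced here.

```python
# Hedged sketch, not the authors' exact algorithm: a one-vs-all LS-SVM
# ensemble. Each LS-SVM is fitted on one subset of the data by solving
# the standard LS-SVM linear system with +/-1 targets; subset outputs
# are averaged and the class with the largest aggregated score wins.
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def fit_ls_svm(X, t, gamma, sigma):
    """Solve the LS-SVM system  [0, 1^T; 1, K + I/gamma][b; alpha] = [0; t]."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], t))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                      # bias b, coefficients alpha

def decision(X_train, b, alpha, X_new, sigma):
    """Real-valued LS-SVM output f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def fit_ensemble(X, y, n_subsets, gamma, sigma, seed=None):
    """Train one-vs-all LS-SVMs on random disjoint subsets of (X, y)."""
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    idx = rng.permutation(len(y))
    machines = []                               # list of (X_subset, {class: (b, alpha)})
    for part in np.array_split(idx, n_subsets):
        Xs, ys = X[part], y[part]
        per_class = {c: fit_ls_svm(Xs, np.where(ys == c, 1.0, -1.0), gamma, sigma)
                     for c in classes}
        machines.append((Xs, per_class))
    return classes, machines

def predict(classes, machines, X_new, sigma):
    """Aggregate the subset machines' outputs and predict the top-scoring class."""
    scores = np.zeros((len(X_new), len(classes)))
    for Xs, per_class in machines:
        for j, c in enumerate(classes):
            b, alpha = per_class[c]
            scores[:, j] += decision(Xs, b, alpha, X_new, sigma)
    return classes[np.argmax(scores, axis=1)]
```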

