 Title & Authors
Deep LS-SVM for regression
Hwang, Changha; Shim, Jooyong
 Abstract
In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs of the hidden-layer LS-SVMs as input variables and the original responses. In contrast to the multilayer neural network (MNN), each LS-SVM in the deep LS-SVM is trained to minimize its own penalized objective function. Thus, the learning dynamics of the deep LS-SVM are entirely different from those of the MNN, in which all weights and biases are trained to minimize a single final error function. Compared to MNN approaches, the deep LS-SVM does not use any combination weights, but trains all LS-SVMs in the architecture. Experimental results on real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems.
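To make the architecture described above concrete, the following is a minimal Python sketch of a deep LS-SVM regressor. It assumes a Gaussian RBF kernel, Gaussian perturbation of the responses in the hidden layer, and placeholder hyperparameters (gamma, sigma, noise_sd, n_hidden); these choices and the helper names (lssvm_fit, deep_lssvm_fit, and so on) are illustrative assumptions, not the paper's specification.

```python
import numpy as np


def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the standard LS-SVM regression linear system for (alpha, b)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    # LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return {"X": X, "alpha": sol[1:], "b": sol[0], "sigma": sigma}


def lssvm_predict(model, X_new):
    K = rbf_kernel(X_new, model["X"], model["sigma"])
    return K @ model["alpha"] + model["b"]


def deep_lssvm_fit(X, y, n_hidden=5, noise_sd=0.1, seed=0):
    """Hidden layer: LS-SVMs trained on the original inputs with perturbed
    responses (here, Gaussian noise added to y -- an assumed perturbation
    scheme). Output layer: the main LS-SVM trained on the hidden-layer
    outputs as inputs and the original responses as targets."""
    rng = np.random.default_rng(seed)
    hidden = [lssvm_fit(X, y + rng.normal(0.0, noise_sd, size=y.shape))
              for _ in range(n_hidden)]
    H = np.column_stack([lssvm_predict(m, X) for m in hidden])
    main = lssvm_fit(H, y)
    return hidden, main


def deep_lssvm_predict(hidden, main, X_new):
    H = np.column_stack([lssvm_predict(m, X_new) for m in hidden])
    return lssvm_predict(main, H)


# Toy usage on synthetic data (purely illustrative).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(100, 1))
    y = np.sin(X[:, 0]) + rng.normal(0.0, 0.1, size=100)
    hidden, main = deep_lssvm_fit(X, y)
    X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
    print(deep_lssvm_predict(hidden, main, X_test))
```

Note that, as in the abstract, there are no combination weights: the main LS-SVM simply treats the hidden-layer predictions as its input variables.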
 Keywords
Deep learning; hidden layer; least squares support vector machine; multilayer neural network; penalized objective function
 Language
English