GACV for partially linear support vector regression

  • Received : 2013.02.19
  • Accepted : 2013.03.10
  • Published : 2013.03.31

Abstract

Partially linear regression can provide a more complete description of the linear and nonlinear relationships among random variables. In support vector regression (SVR), the hyper-parameters are known to affect the performance of the regression. In this paper we propose an iteratively reweighted least squares (IRWLS) procedure to solve the quadratic programming problem of partially linear support vector regression with a modified loss function, which enables us to use the generalized approximate cross validation (GACV) function to select the hyper-parameters. Experimental results are then presented that illustrate the performance of partially linear SVR fitted with the IRWLS procedure.
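The abstract states the method only at a high level, so the following Python sketch is illustrative rather than a reproduction of the paper's algorithm. It fits the partially linear model y ≈ Xβ + f(t), with f represented through a Gaussian kernel, runs IRWLS with observation weights in the style of Perez-Cruz et al. (2000) for the ε-insensitive loss, and scores each hyper-parameter triple (C, ε, γ) with a GCV-style criterion (mean ε-insensitive loss inflated by the effective degrees of freedom) standing in for the paper's GACV function, whose exact form is not given here. All function names, the weight formula, and the score formula are assumptions.

    import numpy as np

    def rbf_kernel(s, t, gamma):
        # Gaussian kernel on the nonparametric covariate t
        return np.exp(-gamma * (s[:, None] - t[None, :]) ** 2)

    def plsvr_irwls(X, t, y, C, eps, gamma, n_iter=100, tol=1e-8):
        # Fit y ~ X @ beta + f(t), f in an RBF kernel space, by iteratively
        # reweighted least squares on a quadratic bound of the eps-insensitive
        # loss (weights in the style of Perez-Cruz et al., 2000).
        n, p = X.shape
        K = rbf_kernel(t, t, gamma)
        beta, alpha = np.zeros(p), np.zeros(n)
        a = np.full(n, 2.0 * C)                 # initial observation weights
        for _ in range(n_iter):
            # weighted, kernel-penalized normal equations for (beta, alpha):
            #   [X'DX      X'DK ] [beta ]   [X'Dy]
            #   [KDX    KDK + K ] [alpha] = [KDy ]
            D = np.diag(a)
            A = np.block([[X.T @ D @ X, X.T @ D @ K],
                          [K @ D @ X,   K @ D @ K + K]])
            b = np.concatenate([X.T @ D @ y, K @ D @ y])
            sol = np.linalg.lstsq(A, b, rcond=None)[0]
            beta_new, alpha_new = sol[:p], sol[p:]
            step = max(np.max(np.abs(beta_new - beta)),
                       np.max(np.abs(alpha_new - alpha)))
            beta, alpha = beta_new, alpha_new
            # refresh weights: zero inside the eps-tube, 2C(|e|-eps)/|e| outside
            u = np.abs(y - X @ beta - K @ alpha)
            a = np.where(u > eps, 2.0 * C * (u - eps) / np.maximum(u, 1e-12), 0.0)
            if step < tol:
                break
        return beta, alpha, K, a

    def gacv_score(X, y, K, beta, alpha, a, eps):
        # GCV-style proxy: mean eps-insensitive loss inflated by the effective
        # degrees of freedom of the converged weighted smoother. This stands in
        # for the paper's GACV formula, which the abstract does not give.
        n = len(y)
        M = np.hstack([X, K])
        D = np.diag(a)
        A = np.block([[X.T @ D @ X, X.T @ D @ K],
                      [K @ D @ X,   K @ D @ K + K]])
        H = M @ np.linalg.pinv(A) @ M.T @ D     # hat matrix: y -> fitted values
        df = np.trace(H)
        fit = M @ np.concatenate([beta, alpha])
        loss = np.maximum(np.abs(y - fit) - eps, 0.0).mean()
        return loss / max(1.0 - df / n, 1e-3)

    # toy data: two linear covariates plus a smooth signal in t
    rng = np.random.default_rng(0)
    n = 80
    X = rng.normal(size=(n, 2))
    t = np.sort(rng.uniform(0.0, 1.0, n))
    y = X @ np.array([1.5, -2.0]) + np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=n)

    best = None
    for C in (1.0, 10.0):                       # small illustrative grid
        for eps in (0.05, 0.1):
            for gamma in (1.0, 10.0):
                beta, alpha, K, a = plsvr_irwls(X, t, y, C, eps, gamma)
                s = gacv_score(X, y, K, beta, alpha, a, eps)
                if best is None or s < best[0]:
                    best = (s, C, eps, gamma)
    print("selected (C, eps, gamma):", best[1:])

A smaller score is better here: the grid at the end selects the triple minimizing the proxy criterion, mirroring how GACV is used for hyper-parameter selection, though the paper's actual GACV expression may differ.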

References

  1. Cho, D. H., Shim, J. and Seok, K. H. (2010). Doubly penalized kernel method for heteroscedastic autoregressive data. Journal of the Korean Data & Information Science Society, 21, 155-162.
  2. Hwang, H. (2010). Fixed size LS-SVM for multiclassification problems of large datasets. Journal of the Korean Data & Information Science Society, 21, 561-567.
  3. Kuhn, H. W. and Tucker, A. W. (1951). Nonlinear programming. Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability, 481-492.
  4. Mercer, J. (1909). Functions of positive and negative type, and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London, Series A, 209, 415-446.
  5. Nychka, D., Gray, G., Haaland, P., Martin, D. and O'Connell, M. (1995). A nonparametric regression approach to syringe grading for quality improvement. Journal of the American Statistical Association, 90, 1171-1178.
  6. Perez-Cruz, F., Navia-Vazquez, A., Alarcon-Diana, P. L. and Artes-Rodriguez, A. (2000). An IRWLS procedure for SVR. In Proceedings of the European Signal Processing Conference (EUSIPCO 2000), Tampere, Finland.
  7. Platt, J. (1998). Sequential minimal optimization: A fast algorithm for training support vector machines, Technical Report MSR-TR-98-14, Microsoft Research, Redmond, Washington.
  8. Shim, J., Kim, C. and Hwang, C. (2011). Semiparametric least squares support vector machine for accelerated failure time model. Journal of the Korean Statistical Society, 40, 75-83. https://doi.org/10.1016/j.jkss.2010.05.002
  9. Smola, A. J. and Schölkopf, B. (1998). On a kernel-based method for pattern recognition, regression, approximation and operator inversion. Algorithmica, 22, 211-231. https://doi.org/10.1007/PL00013831
  10. Vapnik, V. N. (1995). The nature of statistical learning theory, Springer, New York.
  11. Vapnik, V. N. (1998). Statistical learning theory, John Wiley, New York.
  12. Wahba, G., Lin, Y. and Zhang, H. (1999). Generalized approximate cross validation for support vector machines, or another way to look at margin-like quantities, Technical Report 1006, Department of Statistics, University of Wisconsin, Madison.
  13. Wang, L.(Ed.) (2005). Support vector machines: Theory and application, Springer, New York.
  14. Yuan, M. (2006). GACV for quantile smoothing splines. Computational Statistics and Data Analysis, 50, 813-829. https://doi.org/10.1016/j.csda.2004.10.008

Cited by

  1. Support vector quantile regression for autoregressive data vol.25, no.6, 2014, https://doi.org/10.7465/jkdi.2014.25.6.1539
  2. Classification of universities in Daegu·Gyungpook by support vector cluster analysis vol.24, no.4, 2013, https://doi.org/10.7465/jkdi.2013.24.4.783
  3. A polychotomous regression model with tensor product splines and direct sums vol.25, no.1, 2014, https://doi.org/10.7465/jkdi.2014.25.1.19