Semisupervised support vector quantile regression

  • Received : 2015.02.11
  • Accepted : 2015.03.16
  • Published : 2015.03.31

Abstract

Unlabeled examples are easier and less expensive to obtain than labeled examples. In this paper, a semisupervised approach is used to exploit such examples in an effort to enhance the predictive performance of nonlinear quantile regression. We propose a semisupervised quantile regression method, named semisupervised support vector quantile regression (S2SVQR), which is based on the support vector machine. A generalized approximate cross validation method is used to choose the hyper-parameters that affect the performance of the estimator. The experimental results confirm the successful performance of the proposed S2SVQR.
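For readers who want a concrete picture of the ingredients named in the abstract, the sketch below sets up kernel quantile regression with the check (pinball) loss and then exploits unlabeled inputs through a simple pseudo-labeling (self-training) pass. This is only a minimal illustration under stated assumptions, not the paper's S2SVQR formulation: the function names (rbf_kernel, fit_kqr, self_train, predict), the subgradient solver, the pseudo-labeling wrapper, and the fixed hyper-parameter values are choices made here for exposition, whereas the paper selects its hyper-parameters by generalized approximate cross validation (GACV).

```python
# Minimal illustrative sketch: kernel quantile regression plus a generic
# self-training (pseudo-labeling) step for unlabeled inputs.
# NOT the S2SVQR formulation of the paper; all names and settings are
# illustrative assumptions.
import numpy as np

def rbf_kernel(X1, X2, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def pinball_loss(r, tau):
    """Check (pinball) loss rho_tau(r) used in quantile regression."""
    return np.where(r >= 0, tau * r, (tau - 1.0) * r)

def fit_kqr(X, y, tau=0.5, C=10.0, bandwidth=1.0, lr=1e-3, n_iter=2000):
    """Kernel quantile regression, f(x) = sum_i alpha_i K(x, x_i), fitted by
    subgradient descent on  C * sum_i rho_tau(y_i - f(x_i)) + 0.5 * a'Ka."""
    K = rbf_kernel(X, X, bandwidth)
    alpha = np.zeros(len(y))
    for _ in range(n_iter):
        r = y - K @ alpha                      # residuals
        s = np.where(r >= 0, tau, tau - 1.0)   # subgradient of the pinball loss
        grad = -C * (K @ s) + K @ alpha        # gradient of the objective in alpha
        alpha -= lr * grad
    return alpha

def predict(alpha, X_train, X_new, bandwidth=1.0):
    return rbf_kernel(X_new, X_train, bandwidth) @ alpha

def self_train(X_lab, y_lab, X_unlab, tau=0.5, C=10.0, bandwidth=1.0):
    """Pseudo-label the unlabeled inputs with a first-stage fit, then refit on
    the augmented sample (a generic semisupervised heuristic, not the paper's)."""
    alpha = fit_kqr(X_lab, y_lab, tau, C, bandwidth)
    y_pseudo = predict(alpha, X_lab, X_unlab, bandwidth)
    X_aug = np.vstack([X_lab, X_unlab])
    y_aug = np.concatenate([y_lab, y_pseudo])
    return fit_kqr(X_aug, y_aug, tau, C, bandwidth), X_aug

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_lab = rng.uniform(-3, 3, size=(30, 1))
    y_lab = np.sin(X_lab[:, 0]) + rng.normal(scale=0.3, size=30)
    X_unlab = rng.uniform(-3, 3, size=(100, 1))   # inputs only, no responses

    # The paper chooses C and the kernel bandwidth by GACV; fixed values are
    # used here purely for brevity.
    alpha, X_aug = self_train(X_lab, y_lab, X_unlab, tau=0.25)
    f_hat = predict(alpha, X_aug, X_lab)
    print("mean pinball loss (tau=0.25):",
          pinball_loss(y_lab - f_hat, 0.25).mean())
```

In practice the regularization parameter C and the kernel bandwidth would be chosen over a grid by a GACV-type criterion in the spirit of Yuan (2006), rather than fixed as above.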

References

  1. Blum, A. and Mitchell, T. (1998). Combining labeled and unlabeled data with co-training. Proceedings of the 11th Annual Conference on Computational Learning Theory, 92-100, Madison, Wisconsin, United States.
  2. Chapelle, O., Sindhwani, V. and Keerthi, S. (2008). Optimization techniques for semisupervised support vector machines. Journal of Machine Learning Research, 9, 203-233.
  3. Chen, Y., Wang, G. and Dong, S. (2002). Learning with progressive transductive support vector machine. Proceedings of International Conference on Data Mining, 67-74, Maebashi City, Japan.
  4. Hwang, H. (2010). Fixed size LS-SVM for multiclassification problems of large datasets. Journal of the Korean Data & Information Science Society, 21, 561-567.
  5. Koenker, R. and Bassett, G. (1978). Regression quantiles. Econometrica, 46, 33-50. https://doi.org/10.2307/1913643
  6. Koenker, R. (2005). Quantile regression, Cambridge University Press.
  7. Kuhn, H. W. and Tucker, A. W. (1951). Nonlinear programming. In Proceedings of 2nd Berkeley Symposium, 481-492, University of California Press, Berkeley.
  8. Li, Y., Liu, Y. and Zhu, J. (2007). Quantile regression in reproducing kernel Hilbert spaces. Journal of the American Statistical Association, 102, 255-268. https://doi.org/10.1198/016214506000000979
  9. Mercer, J. (1909). Functions of positive and negative type and their connection with the theory of integral equations. Philosophical Transactions of the Royal Society of London A, 209, 415-446.
  10. Seok, K. H. (2010). Semi-supervised classification with LS-SVM formulation. Journal of the Korean Data & Information Science Society, 21, 461-470.
  11. Seok, K. H. (2012). Study on semi-supervised local constant regression estimation. Journal of the Korean Data & Information Science Society, 23, 579-585. https://doi.org/10.7465/jkdi.2012.23.3.579
  12. Seok, K. H. (2014). Semi-supervised regression based on support vector machine. Journal of the Korean Data & Information Science Society, 25, 447-454. https://doi.org/10.7465/jkdi.2014.25.2.447
  13. Seok, K. H. (2013). A study on semi-supervised kernel ridge regression estimation. Journal of the Korean Data & Information Science Society, 24, 341-353. https://doi.org/10.7465/jkdi.2013.24.2.341
  14. Shim, J. and Hwang, C. (2009). Support vector censored quantile regression under random censoring. Computational Statistics and Data Analysis, 53, 912-917. https://doi.org/10.1016/j.csda.2008.10.037
  15. Smola, A. and Scholkopf, B. (1998). On a kernel-based method for pattern recognition, regression, approximation and operator inversion. Algorithmica, 22, 211-231. https://doi.org/10.1007/PL00013831
  16. Suykens, J. A. K. and Vandewalle, J. (1999). Least squares support vector machine classifiers. Neural Processing Letters, 9, 293-300. https://doi.org/10.1023/A:1018628609742
  17. Vapnik, V. N. (1995). The nature of statistical learning theory, Springer, New York.
  18. Vapnik, V. N. (1998). Statistical learning theory, John Wiley, New York.
  19. Wang, L.(Ed.) (2005). Support vector machines: Theory and application, Springer, Berlin Heidelberg, New York.
  20. Wang, J., Shen, X. and Pan, W. (2007). On transductive support vector machine. Contemporary Mathematics, 43, 7-19.
  21. Xu, S., An, X., Qiao, X., Zhu, L. and Li, L. (2011). Semisupervised least squares support vector regression machines. Journal of Information & Computational Science, 8, 885-892.
  22. Yu, K., Lu, Z. and Stander, J. (2003). Quantile regression: Applications and current research areas. The Statistician, 52, 331-350.
  23. Yuan, M. (2006). GACV for quantile smoothing splines. Computational Statistics and Data Analysis, 50, 813-829. https://doi.org/10.1016/j.csda.2004.10.008

Cited by

  1. Deep LS-SVM for regression vol.27, pp.3, 2016, https://doi.org/10.7465/jkdi.2016.27.3.827
  2. Geographically weighted kernel logistic regression for small area proportion estimation vol.27, pp.2, 2016, https://doi.org/10.7465/jkdi.2016.27.2.531
  3. Smoothing parameter selection in semi-supervised learning vol.27, pp.4, 2016, https://doi.org/10.7465/jkdi.2016.27.4.993
  4. Multioutput LS-SVR based residual MCUSUM control chart for autocorrelated process vol.27, pp.2, 2016, https://doi.org/10.7465/jkdi.2016.27.2.523
  5. Deep multiple kernel least squares support vector regression machine vol.29, pp.4, 2018, https://doi.org/10.7465/jkdi.2018.29.4.895