• Title/Summary/Keyword: support vector regression


Semi-supervised regression based on support vector machine

  • Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.25 no.2
    • /
    • pp.447-454
    • /
    • 2014
  • In many practical machine learning and data mining applications, unlabeled training examples are readily available but labeled ones are fairly expensive to obtain. Therefore, semi-supervised learning algorithms have attracted much attention. However, previous research mainly focuses on classification problems. In this paper, a semi-supervised regression method based on the support vector regression (SVR) formulation is proposed. The estimator is easily obtained via the dual formulation of the optimization problem. The experimental results with simulated and real data suggest superior performance of the proposed method compared with standard SVR.
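
The semi-supervised estimator itself is given through the dual formulation in the paper and is not reproduced here. As a point of reference, a minimal supervised SVR baseline of the kind the authors compare against can be run with scikit-learn (a sketch under the assumption that scikit-learn is available; the toy data are placeholders):

```python
import numpy as np
from sklearn.svm import SVR

# Toy labeled data standing in for the "expensive" labeled examples.
rng = np.random.default_rng(0)
X_labeled = rng.uniform(-3, 3, size=(50, 1))
y_labeled = np.sin(X_labeled).ravel() + rng.normal(scale=0.1, size=50)

# Standard (fully supervised) epsilon-insensitive SVR baseline.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma="scale")
svr.fit(X_labeled, y_labeled)

X_test = np.linspace(-3, 3, 200).reshape(-1, 1)
y_pred = svr.predict(X_test)
```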

Iterative Support Vector Quantile Regression for Censored Data

  • Shim, Joo-Yong;Hong, Dug-Hun;Kim, Dal-Ho;Hwang, Chang-Ha
    • Communications for Statistical Applications and Methods
    • /
    • v.14 no.1
    • /
    • pp.195-203
    • /
    • 2007
  • In this paper we propose support vector quantile regression (SVQR) for randomly right-censored data. The proposed procedure utilizes an iterative method based on the empirical distribution functions of the censoring times and the sample quantiles of the observed variables, and applies support vector regression for the estimation of the quantile function. Experimental results are then presented to indicate the performance of the proposed procedure.
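
For context, SVQR is built on the check (pinball) loss. A standard form of that loss, for quantile level τ ∈ (0, 1) and residual r = y − f(x), is shown below; the paper's iterative reweighting by the empirical distribution of the censoring times is not reproduced here.

```latex
\rho_\tau(r) =
\begin{cases}
\tau\, r, & r \ge 0,\\
(\tau - 1)\, r, & r < 0.
\end{cases}
```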

Semisupervised support vector quantile regression

  • Seok, Kyungha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.26 no.2
    • /
    • pp.517-524
    • /
    • 2015
  • Unlabeled examples are easier and less expensive to obtain than labeled examples. In this paper a semisupervised approach is used to utilize such examples in an effort to enhance the predictive performance of nonlinear quantile regression problems. We propose a semisupervised quantile regression method, named semisupervised support vector quantile regression (S2SVQR), which is based on the support vector machine. A generalized approximate cross validation method is used to choose the hyperparameters that affect the performance of the estimator. The experimental results confirm the successful performance of the proposed S2SVQR.

Expected shortfall estimation using kernel machines

  • Shim, Jooyong;Hwang, Changha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.24 no.3
    • /
    • pp.625-636
    • /
    • 2013
  • In this paper we study four kernel machines for estimating expected shortfall, which are constructed through combinations of support vector quantile regression (SVQR), restricted SVQR (RSVQR), the least squares support vector machine (LS-SVM) and support vector expectile regression (SVER). These kernel machines have the obvious advantage that they achieve nonlinear models without requiring the explicit form of the nonlinear mapping function. Moreover, they need no assumption about the underlying probability distribution of the errors. Through numerical studies on two artificial and two real data sets we show their effectiveness in estimation performance at various confidence levels.
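
As background on the quantity being estimated: at confidence level α, the expected shortfall is the mean loss beyond the α-level value-at-risk (the α-quantile). A minimal unconditional empirical version is sketched below with numpy; the paper estimates the conditional version with the kernel machines above, which is not reproduced here.

```python
import numpy as np

def empirical_es(losses, alpha=0.95):
    # Value-at-Risk: the alpha-quantile of the loss distribution.
    var = np.quantile(losses, alpha)
    # Expected shortfall: mean loss in the tail beyond VaR.
    return losses[losses >= var].mean()

losses = np.random.default_rng(1).standard_t(df=4, size=10_000)
print(empirical_es(losses, alpha=0.95))
```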

Generalized Support Vector Quantile Regression

  • Lee, Dongju;Choi, Sujin
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.43 no.4
    • /
    • pp.107-115
    • /
    • 2020
  • Support vector regression (SVR) is devised to solve the regression problem by utilizing the excellent predictive power of the support vector machine. In particular, the ε-insensitive loss function, a loss function often used in SVR, does not generate a penalty if the difference between the actual value and the estimated regression curve is within ε. In most studies, the ε-insensitive loss function is used symmetrically, and the question of interest is how to determine the value of ε. In SVQR (support vector quantile regression), the asymmetry of the width of ε and the slope of the penalty is controlled using the parameter p. However, the slope of the penalty is fixed according to the p value that determines the asymmetry of ε. In this study, a new ε-insensitive loss function with parameters p1 and p2 is proposed. A new asymmetric SVR called GSVQR (generalized support vector quantile regression), based on the new ε-insensitive loss function, can control the asymmetry of the width of ε and the slope of the penalty using the parameters p1 and p2, respectively. Figures show that the asymmetry of the width of ε and the slope of the penalty is indeed controlled. Finally, through an experiment on a test function, the accuracy of the existing symmetric soft-margin SVR, the asymmetric SVQR, and the asymmetric GSVQR is examined, and the characteristics of each are shown through figures.
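
To illustrate the kind of loss being generalized, the sketch below shows the standard symmetric ε-insensitive loss next to a generic asymmetric variant with separate insensitivity widths and penalty slopes for the two sides of the residual. The parameter names and exact form are illustrative assumptions, not the paper's GSVQR definition.

```python
import numpy as np

def eps_insensitive(r, eps=0.1):
    # Symmetric epsilon-insensitive loss used in standard SVR:
    # zero penalty while |r| <= eps, linear penalty beyond that.
    return np.maximum(0.0, np.abs(r) - eps)

def asym_eps_insensitive(r, eps_pos=0.1, eps_neg=0.3, slope_pos=0.7, slope_neg=0.3):
    # Illustrative asymmetric variant: separate insensitivity widths
    # (eps_pos, eps_neg) and penalty slopes (slope_pos, slope_neg) for
    # positive and negative residuals. Hypothetical parameterisation,
    # shown only to convey the idea of controlling the width asymmetry
    # and the slope asymmetry independently.
    return (slope_pos * np.maximum(0.0, r - eps_pos)
            + slope_neg * np.maximum(0.0, -r - eps_neg))
```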

Mechanical Parameter Identification of Servo Systems using Robust Support Vector Regression

  • Cho Kyung-Rae;Seok Jul-Ki
    • The Transactions of the Korean Institute of Power Electronics
    • /
    • v.10 no.5
    • /
    • pp.468-480
    • /
    • 2005
  • The overall performance of an AC servo system is greatly affected by the uncertainties of unpredictable mechanical parameter variations and external load disturbances. To overcome this problem, it is necessary to know the various parameters and load disturbances involved in position/speed control. This paper proposes an on-line identification method of mechanical parameters and load disturbances for AC servo systems using support vector regression (SVR). The experimental results demonstrate that the proposed SVR algorithm is appropriate for the control of unknown servo systems even with time-varying/nonlinear parameters.
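
A simplified offline illustration of the idea, not the paper's on-line algorithm: for a rigid servo model T = J·dω/dt + B·ω, a linear-kernel SVR regressing torque on acceleration and speed yields primal weights that approximate the inertia J and viscous friction B. The data-generation step and parameter values below are assumptions made only for the sketch.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic servo data from T = J*domega + B*omega (plus measurement noise).
rng = np.random.default_rng(2)
J_true, B_true = 0.01, 0.002               # inertia [kg*m^2], friction [N*m*s]
omega = rng.uniform(0, 200, size=500)      # speed samples
domega = rng.uniform(-500, 500, size=500)  # acceleration samples
torque = J_true * domega + B_true * omega + rng.normal(scale=1e-3, size=500)

# Linear-kernel SVR: its primal weights estimate (J, B).
svr = SVR(kernel="linear", C=100.0, epsilon=1e-3)
svr.fit(np.column_stack([domega, omega]), torque)
J_hat, B_hat = svr.coef_.ravel()
print(J_hat, B_hat)
```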

Restricted support vector quantile regression without crossing

  • Shim, Joo-Yong;Lee, Jang-Taek
    • Journal of the Korean Data and Information Science Society
    • /
    • v.21 no.6
    • /
    • pp.1319-1325
    • /
    • 2010
  • Quantile regression provides a more complete statistical analysis of the stochastic relationships among random variables. However, quantile functions estimated at different orders can sometimes cross each other. We propose a new non-crossing quantile regression method, restricted support vector quantile regression, which applies support vector median regression to the restricted regression quantile. The proposed method provides a satisfying solution to estimating non-crossing quantile functions when multiple quantiles are needed for high dimensional data. We also present a model selection method that employs cross validation techniques for choosing the parameters which affect the performance of the proposed method. One real example and a simulated example are provided to show the usefulness of the proposed method.
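
For context on the crossing problem: when quantile curves are fitted independently, a higher-order quantile can dip below a lower-order one. Below is a minimal numpy check, plus the simple post-hoc monotone rearrangement patch; this is only a diagnostic and a crude fix, not the restricted SVQR formulation proposed in the paper, and the array is a placeholder.

```python
import numpy as np

# q_hat: fitted quantile curves, shape (n_points, n_taus), columns ordered
# by increasing quantile level tau (placeholder values for the sketch).
q_hat = np.array([[0.8, 1.0, 0.9],   # second row of diffs is negative: crossing
                  [0.5, 0.7, 1.1]])

crossed = np.any(np.diff(q_hat, axis=1) < 0, axis=1)
print(crossed)                        # [ True False]

# Post-hoc fix: sort each row so the estimated curves cannot cross.
q_noncrossing = np.sort(q_hat, axis=1)
```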

Design of controller using Support Vector Regression

  • Hwang, Ji-Hwan;Kwak, Hwan-Joo;Park, Gwi-Tae
    • Proceedings of the IEEK Conference
    • /
    • 2009.05a
    • /
    • pp.320-322
    • /
    • 2009
  • Support vector learning attracts great interest in the areas of pattern classification, function approximation, and abnormality detection. In this paper, we design a controller using support vector regression, which has good properties in comparison with the multi-layer perceptron or radial basis function network. The applicability of the presented method is illustrated via an example simulation.


A Differential Evolution based Support Vector Clustering

  • Jun, Sung-Hae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.17 no.5
    • /
    • pp.679-683
    • /
    • 2007
  • Statistical learning theory by Vapnik consists of the support vector machine (SVM), support vector regression (SVR), and support vector clustering (SVC) for classification, regression, and clustering, respectively. Among these algorithms, SVC is a good clustering algorithm that uses support vectors based on a Gaussian kernel function. However, similar to SVM and SVR, SVC needs its kernel parameters and regularization constant to be determined optimally. In general, the parameters have been determined by the experience of researchers or by grid search, which demands heavy computing time. In this paper, we propose a differential evolution based SVC (DESVC), which combines differential evolution with SVC for efficient selection of the kernel parameters and regularization constant. To verify the improved performance of our DESVC, we conduct experiments using data sets from the UCI machine learning repository and simulation.
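
The paper applies differential evolution to support vector clustering; the same search idea can be sketched with SciPy's differential evolution tuning a kernel machine's (C, γ) by cross-validation. An SVR stands in for the clustering objective below purely to keep the sketch self-contained, so this shows the DE-based hyperparameter selection rather than DESVC itself.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=120)

def objective(params):
    # params = (log10 C, log10 gamma); minimise negative cross-validated R^2.
    C, gamma = 10.0 ** params[0], 10.0 ** params[1]
    score = cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=5).mean()
    return -score

result = differential_evolution(objective, bounds=[(-2, 3), (-3, 1)],
                                maxiter=20, seed=0)
print(result.x, -result.fun)
```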

Support Vector Median Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.14 no.1
    • /
    • pp.67-74
    • /
    • 2003
  • Median regression analysis has robustness properties which make it an attractive alternative to regression based on the mean. The support vector machine (SVM) is widely used in real-world regression tasks. In this paper, we propose a new SV median regression based on the check function, illustrate how this proposed SVM performs, and compare it with the SVM based on the absolute deviation loss function.
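
At τ = 0.5 the standard check function reduces to half the absolute deviation, which is why the proposed SV median regression and the absolute-deviation SVM it is compared against target the same conditional median. A minimal numpy check of this identity (the check-function form here is the standard one, assumed rather than quoted from the paper):

```python
import numpy as np

def check_loss(r, tau):
    # Standard check (pinball) function used in quantile regression.
    return r * (tau - (r < 0))

r = np.linspace(-2, 2, 9)
assert np.allclose(check_loss(r, 0.5), 0.5 * np.abs(r))
```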
