• Title/Summary/Keyword: Kernel machines


On Predicting with Kernel Ridge Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.14 no.1 / pp.103-111 / 2003
  • Kernel machines are widely used in real-world regression tasks; kernel ridge regression (KRR) and support vector machines (SVM) are typical examples. Here we focus on two types of KRR: inductive KRR and transductive KRR. In this paper, we study how differently they behave in the interpolation and extrapolation regions. Furthermore, we study a prediction interval estimation method for KRR, which turns out to be a reliable and practical measure of prediction uncertainty and is essential in real-world tasks.
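
To make the inductive variant concrete, here is a minimal sketch of closed-form kernel ridge regression with an RBF kernel (inductive KRR fits on the training inputs only; transductive KRR would also use the test inputs). This is not the paper's implementation; the 1-D data, bandwidth gamma and ridge parameter lam are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between row-vector sets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_krr(X, y, gamma=1.0, lam=1e-2):
    """Inductive KRR: solve (K + lam*I) alpha = y once on the training set."""
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return lambda X_new: rbf_kernel(X_new, X, gamma) @ alpha

# Hypothetical 1-D example: interpolation inside [0, 5], extrapolation beyond.
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
predict = fit_krr(X, y, gamma=2.0, lam=1e-2)
print(predict(np.array([[2.5], [7.0]])))  # an interpolation point vs. an extrapolation point
```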


TL-FINITE STATE MACHINES OVER FINITE GROUPS

  • Cho, Sung-Jin
    • Journal of applied mathematics & informatics / v.8 no.3 / pp.1009-1019 / 2001
  • We introduce the concepts of TL-finite state machines, TL-kernels and TL-subfinite state machines, and obtain some results concerning them.

A note on SVM estimators in RKHS for the deconvolution problem

  • Lee, Sungho
    • Communications for Statistical Applications and Methods / v.23 no.1 / pp.71-83 / 2016
  • In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of SVM is that it leads to sparse solutions, but the support vector deconvolution density estimator does not preserve sparsity as well as we expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) which leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.
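
For reference, the classical kernel deconvolution density estimator used as the benchmark above can be sketched as follows for Gaussian measurement error. This is a generic textbook-style construction, not the paper's proposed estimators; the kernel with Fourier transform (1 - t^2)^3 on [-1, 1], the bandwidth h and the noise level sigma are illustrative assumptions.

```python
import numpy as np

def deconvolution_kde(x_grid, Y, h, sigma):
    """Classical deconvoluting-kernel estimate of the density of X at x_grid,
    given contaminated observations Y = X + eps with eps ~ N(0, sigma^2)."""
    t = np.linspace(-1, 1, 201)
    phi_K = (1 - t ** 2) ** 3
    # Divide by the Gaussian error characteristic function evaluated at t / h.
    weight = phi_K * np.exp(sigma ** 2 * t ** 2 / (2 * h ** 2))
    u = (x_grid[:, None] - Y[None, :]) / h                       # (grid, n)
    # Deconvoluting kernel: inverse Fourier transform of `weight` at each u.
    K_n = np.trapz(np.cos(u[..., None] * t) * weight, t, axis=-1) / (2 * np.pi)
    return K_n.sum(axis=1) / (len(Y) * h)

# Hypothetical data: X ~ N(0, 1) contaminated with N(0, 0.4^2) measurement error.
rng = np.random.default_rng(1)
Y = rng.standard_normal(300) + 0.4 * rng.standard_normal(300)
grid = np.linspace(-3, 3, 41)
f_hat = deconvolution_kde(grid, Y, h=0.6, sigma=0.4)
```

Note that the bandwidth must be kept large enough relative to sigma, since the weight factor grows exponentially as h shrinks; this is the usual ill-posedness of the deconvolution problem mentioned in the abstract.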

Use of Support Vector Machines in Biped Humanoid Robot for Stable Walking (안정적인 보행을 위한 이족 휴머노이드 로봇에서의 서포트 벡터 머신 이용)

  • Kim Dong-Won;Park Gwi-Tae
    • Journal of Institute of Control, Robotics and Systems / v.12 no.4 / pp.315-319 / 2006
  • Support vector machines applied to a biped humanoid robot are presented in this paper. The trajectory of the zero moment point (ZMP) is an important criterion for the balance of a biped walking robot, but the complex dynamics involved make robot control difficult. We establish empirical relationships based on the dynamic stability of motion using SVMs, which, together with the kernel method, have become a very popular approach to learning from examples. We applied SVM to model a practical humanoid robot. Three kinds of kernels were employed and their results compared; the SVM-based models were found to work well, with the RBF kernel providing the best results. The simulation results show that the ZMP generated from the SVM can improve the stability of the biped walking robot and can be effectively used to model and control a practical biped walking robot.
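
The kernel comparison for SVM regression described above can be reproduced in spirit with a short scikit-learn sketch. The noisy periodic signal below is a hypothetical stand-in for a ZMP trajectory, and the kernel set (linear, polynomial, RBF) and hyperparameters are assumptions; the abstract only names the RBF kernel explicitly.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

# Hypothetical stand-in for a ZMP trajectory: a noisy periodic signal over time.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 200).reshape(-1, 1)
zmp = 0.05 * np.sin(2 * np.pi * t[:, 0]) + 0.005 * rng.standard_normal(200)

# Fit support vector regression with three different kernels and compare fit error.
for kernel in ("linear", "poly", "rbf"):
    model = SVR(kernel=kernel, C=10.0, epsilon=0.001)
    model.fit(t, zmp)
    rmse = np.sqrt(mean_squared_error(zmp, model.predict(t)))
    print(f"{kernel:>6s} kernel  RMSE = {rmse:.4f}")
```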

Expected shortfall estimation using kernel machines

  • Shim, Jooyong;Hwang, Changha
    • Journal of the Korean Data and Information Science Society / v.24 no.3 / pp.625-636 / 2013
  • In this paper we study four kernel machines for estimating expected shortfall, constructed through combinations of support vector quantile regression (SVQR), restricted SVQR (RSVQR), least squares support vector machine (LS-SVM) and support vector expectile regression (SVER). These kernel machines have the obvious advantage that they yield nonlinear models without requiring the explicit form of the nonlinear mapping function. Moreover, they need no assumption about the underlying probability distribution of the errors. Through numerical studies on two artificial and two real data sets we show their effectiveness in estimation performance at various confidence levels.
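
One generic route from kernel machines to expected shortfall is via conditional quantiles: fit a kernel quantile regression at several levels below alpha and average the fitted curves (a plug-in for ES as the mean of the lower tail). The sketch below illustrates that route only; it is not any of the four machines from the paper, and the RBF kernel, pinball loss minimized with L-BFGS, and all hyperparameters are assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf(A, B, gamma=1.0):
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))

def kernel_quantile_fit(X, y, tau, gamma=1.0, lam=1e-2):
    """Kernel quantile regression: pinball loss plus an RKHS ridge penalty."""
    K, n = rbf(X, X, gamma), len(y)

    def objective(p):
        a, b = p[:n], p[n]
        r = y - (K @ a + b)
        return np.mean(np.maximum(tau * r, (tau - 1) * r)) + lam * a @ K @ a

    p = minimize(objective, np.zeros(n + 1), method="L-BFGS-B").x
    return lambda Xnew: rbf(Xnew, X, gamma) @ p[:n] + p[n]

def expected_shortfall(X, y, alpha=0.05, levels=5, **kw):
    """Plug-in ES_alpha(x): average of fitted quantile curves at levels below alpha."""
    taus = np.linspace(alpha / levels, alpha, levels)
    fits = [kernel_quantile_fit(X, y, t, **kw) for t in taus]
    return lambda Xnew: np.mean([f(Xnew) for f in fits], axis=0)

# Hypothetical heteroscedastic returns depending on a single covariate.
rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (150, 1))
y = 0.5 * X[:, 0] + (0.5 + 0.5 * np.abs(X[:, 0])) * rng.standard_normal(150)
es = expected_shortfall(X, y, alpha=0.05, gamma=3.0)
print(es(np.array([[-0.5], [0.5]])))
```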

Kernel method for autoregressive data

  • Shim, Joo-Yong;Lee, Jang-Taek
    • Journal of the Korean Data and Information Science Society / v.20 no.5 / pp.949-954 / 2009
  • In this paper, the autoregressive process is applied to kernel regression in order to infer nonlinear models for predicting responses. We propose a kernel method for autoregressive data which estimates the mean function with kernel machines. We also present a model selection method which employs cross-validation techniques for choosing the hyperparameters that affect the performance of kernel regression. Artificial and real examples are provided to show the usefulness of the proposed method for estimating the mean function in the presence of autocorrelation in the data.
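
A generic way to pair kernel regression with cross-validated hyperparameter choice on autocorrelated data is sketched below with scikit-learn. The AR(1) simulation, the KernelRidge estimator and the time-ordered CV splitter are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

# Hypothetical AR(1) errors around a nonlinear mean function of time.
rng = np.random.default_rng(3)
n, phi = 200, 0.7
e = np.zeros(n)
for i in range(1, n):
    e[i] = phi * e[i - 1] + rng.standard_normal()
t = np.linspace(0, 10, n).reshape(-1, 1)
y = np.sin(t[:, 0]) + 0.3 * e

# Choose kernel width and ridge penalty by cross-validation that respects time order.
grid = {"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.1, 1.0, 10.0]}
search = GridSearchCV(KernelRidge(kernel="rbf"), grid, cv=TimeSeriesSplit(n_splits=5))
search.fit(t, y)
print(search.best_params_)
```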


Fine-tuning SVM for Enhancing Speech/Music Classification (SVM의 미세조정을 통한 음성/음악 분류 성능향상)

  • Lim, Chung-Soo;Song, Ji-Hyun;Chang, Joon-Hyuk
    • Journal of the Institute of Electronics Engineers of Korea SP / v.48 no.2 / pp.141-148 / 2011
  • Support vector machines have been extensively studied and utilized in pattern recognition for years. One interesting application of this technique is speech/music classification for a standardized codec such as the 3GPP2 selectable mode vocoder. In this paper, we propose a novel approach that improves the speech/music classification of support vector machines. While conventional support vector machine optimization techniques apply during the training phase, the proposed technique can be adopted in the classification phase. In this regard, the proposed approach can be developed and employed in parallel with conventional optimizations, resulting in a synergistic boost in classification performance. We first analyze the impact of the kernel width parameter on the classifications made by support vector machines. From this analysis, we observe that the outputs of support vector machines can be fine-tuned with the kernel width parameter. To make the most of this capability, we identify strong correlation among neighboring input frames and use this correlation information as a guide for adjusting the kernel width parameter. According to the experimental results, the proposed algorithm is found to have potential for improving the performance of support vector machines.
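
To make the idea of adjusting the kernel width at classification time concrete, the sketch below re-evaluates a trained RBF SVM's decision function with a perturbed gamma, reusing the learned dual coefficients and bias. This heuristic re-evaluation, the synthetic two-class "frames", and the gamma scaling factor are assumptions for illustration, not the paper's exact fine-tuning rule.

```python
import numpy as np
from sklearn.svm import SVC

def decision_with_gamma(svm, X, gamma):
    """Re-evaluate an RBF SVM's decision values with a different kernel width,
    reusing the dual coefficients and intercept learned at training time."""
    sq = ((X[:, None, :] - svm.support_vectors_[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    return K @ svm.dual_coef_.ravel() + svm.intercept_[0]

# Hypothetical two-class feature frames (e.g., speech vs. music descriptors).
rng = np.random.default_rng(4)
X = np.vstack([rng.normal(0.0, 1.0, (100, 8)), rng.normal(1.0, 1.0, (100, 8))])
y = np.r_[np.zeros(100), np.ones(100)]
svm = SVC(kernel="rbf", gamma=0.5, C=1.0).fit(X, y)

x_frames = X[:5]
base = decision_with_gamma(svm, x_frames, 0.5)         # matches svm.decision_function
tuned = decision_with_gamma(svm, x_frames, 0.5 * 0.8)  # widened kernel at test time
print(np.c_[base, tuned])
```

In the paper's setting the amount of widening or narrowing would be driven by the correlation with neighboring frames; here the factor 0.8 is purely illustrative.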

COMPARATIVE STUDY OF THE PERFORMANCE OF SUPPORT VECTOR MACHINES WITH VARIOUS KERNELS

  • Nam, Seong-Uk;Kim, Sangil;Kim, HyunMin;Yu, YongBin
    • East Asian mathematical journal / v.37 no.3 / pp.333-354 / 2021
  • A support vector machine (SVM) is a state-of-the-art machine learning model rooted in structural risk minimization. SVM is underestimated with regard to its application to real-world problems because of the difficulties associated with its use. We aim to show that the performance of SVM highly depends on which kernel function is used. To this end, after providing a summary of support vector machines and kernel functions, we constructed experiments with various benchmark datasets to compare the performance of various kernel functions. For evaluating the performance of SVM, the F1-score and its standard deviation under 10-fold cross-validation were used. Furthermore, we used Taylor diagrams to reveal the differences between kernels. Finally, we provide Python code for all our experiments to enable their re-implementation.
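
The evaluation protocol described above (F1-score with 10-fold cross-validation over several kernels) can be sketched along the following lines; the breast-cancer benchmark dataset is an assumed stand-in for the paper's benchmark datasets, and the kernel list and default hyperparameters are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Compare mean F1-score and its standard deviation across kernels under 10-fold CV.
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    model = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    scores = cross_val_score(model, X, y, cv=10, scoring="f1")
    print(f"{kernel:>8s}: F1 = {scores.mean():.3f} +/- {scores.std():.3f}")
```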

Modeling properties of self-compacting concrete: support vector machines approach

  • Siddique, Rafat;Aggarwal, Paratibha;Aggarwal, Yogesh;Gupta, S.M.
    • Computers and Concrete / v.5 no.5 / pp.461-473 / 2008
  • The paper explores the potential of the Support Vector Machines (SVM) approach in predicting the 28-day compressive strength and slump flow of self-compacting concrete. A total of 80 data points collected from the existing literature were used in the present work. To compare the performance of the technique, prediction was also done using a back-propagation neural network model. For this data set, the RBF kernel worked well in comparison to polynomial-kernel support vector machines, providing a root mean square error of 4.688 MPa (correlation coefficient = 0.942) for 28-day compressive strength prediction and a root mean square error of 7.825 cm (correlation coefficient = 0.931) for slump flow. The RMSE and correlation coefficient results suggested that the Support Vector Machine approach performed comparably to the neural network approach for both 28-day compressive strength and slump flow prediction.
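
The RMSE and correlation-coefficient comparison between the SVM and neural-network models could be computed roughly as below. The random regression data stands in for the 80 literature records, which are not reproduced here, and the model hyperparameters are assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical stand-in for the 80 mix-design records (inputs -> 28-day strength).
X, y = make_regression(n_samples=80, n_features=6, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "SVR (RBF kernel)": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0)),
    "Back-propagation NN": make_pipeline(StandardScaler(),
                                         MLPRegressor(hidden_layer_sizes=(20,),
                                                      max_iter=5000, random_state=0)),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = np.sqrt(np.mean((y_te - pred) ** 2))
    r = np.corrcoef(y_te, pred)[0, 1]            # correlation coefficient
    print(f"{name:>20s}: RMSE = {rmse:.3f}, r = {r:.3f}")
```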

Kernelized Structure Feature for Discriminating Meaningful Table from Decorative Table (장식 테이블과 의미 있는 테이블 식별을 위한 커널 기반의 구조 자질)

  • Son, Jeong-Woo;Go, Jun-Ho;Park, Seong-Bae;Kim, Kweon-Yang
    • Journal of the Korean Institute of Intelligent Systems / v.21 no.5 / pp.618-623 / 2011
  • This paper proposes a novel method to discriminate meaningful tables from decorative ones using a composite kernel that handles the structural information of tables. The structural information of a table is extracted with two types of parse trees: a context tree and a table tree. A context tree contains structural information around a table, while a table tree represents structural information within a table. A composite kernel based on a parse tree kernel is proposed to efficiently handle these two types of trees. Support vector machines with the proposed kernel distinguish meaningful tables from decorative ones using this rich structural information.
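
In a simplified precomputed form, a composite kernel of this kind can be expressed as a weighted sum of a context-tree kernel matrix and a table-tree kernel matrix fed to an SVM. In the sketch below the two tree-kernel matrices are random stand-ins (computing real parse-tree kernels is outside this snippet), and the mixing weight mu is an assumption.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)
n, mu = 60, 0.5

def random_psd_kernel(n):
    """Stand-in for a normalized tree-kernel matrix (real parse-tree kernels omitted)."""
    F = rng.standard_normal((n, 16))
    K = F @ F.T
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)

K_context = random_psd_kernel(n)   # kernel over context trees (structure around the table)
K_table = random_psd_kernel(n)     # kernel over table trees (structure within the table)
y = rng.integers(0, 2, n)          # 1 = meaningful table, 0 = decorative table

# Composite kernel: convex combination of the two tree kernels.
K = mu * K_context + (1 - mu) * K_table
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.predict(K[:5]))          # rows of the (test x train) kernel matrix
```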