A note on SVM estimators in RKHS for the deconvolution problem
Lee, Sungho
In this paper we discuss a deconvolution density estimator obtained using support vector machines (SVM) and Tikhonov's regularization method for solving ill-posed problems in a reproducing kernel Hilbert space (RKHS). A remarkable property of the SVM is that it leads to sparse solutions; however, the support vector deconvolution density estimator does not preserve sparsity as well as expected. Thus, in Section 3, we propose another support vector deconvolution estimator (method II) that leads to a very sparse solution. The performance of the deconvolution density estimators based on the support vector method is compared with that of the classical kernel deconvolution density estimator for the important cases of Gaussian and Laplacian measurement error by means of a simulation study. In the case of Gaussian error, the proposed support vector deconvolution estimator shows the same performance as the classical kernel deconvolution density estimator.
Keywords: deconvolution; ill-posed problem; kernel density estimator; regularization; reproducing kernel Hilbert space (RKHS); support vector machines (SVM)
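As background for the abstract, the display below sketches the problem in standard notation: the additive measurement-error model, the classical deconvoluting kernel density estimator of Stefanski and Carroll (1990) used as the benchmark, and the generic Tikhonov-regularized RKHS problem behind the support vector estimators. The symbols used here (the kernel K, bandwidth h, Fourier transform of K, error characteristic function, loss L, regularization parameter lambda, RKHS kernel k, and coefficients alpha_i) are standard textbook notation rather than the paper's own, and the last line is the generic representer-theorem form (Scholkopf et al., 2001), not the paper's specific deconvolution functional.

\documentclass{article}
\usepackage{amsmath}
\DeclareMathOperator*{\argmin}{arg\,min}

\begin{document}

% Deconvolution setting: the target density is that of X, but only contaminated
% observations Y_i = X_i + eps_i are available, with known error density.
\begin{gather*}
  Y_i = X_i + \varepsilon_i, \qquad i = 1, \dots, n,
  \qquad \varepsilon_i \sim f_{\varepsilon} \ \text{(known, e.g. Gaussian or Laplacian)}. \\[6pt]
% Classical deconvoluting kernel density estimator (Stefanski and Carroll, 1990),
% with kernel K, bandwidth h, Fourier transform \varphi_K of K, and error
% characteristic function \varphi_\varepsilon:
  \hat{f}_X(x) = \frac{1}{nh} \sum_{j=1}^{n} K^{*}\!\left(\frac{x - Y_j}{h}\right),
  \qquad
  K^{*}(u) = \frac{1}{2\pi} \int e^{-\mathrm{i}tu}\,
    \frac{\varphi_K(t)}{\varphi_{\varepsilon}(t/h)}\, \mathrm{d}t. \\[6pt]
% Generic Tikhonov-regularized estimate in an RKHS H with kernel k; by the
% representer theorem (Scholkopf et al., 2001) the minimizer is a finite kernel
% expansion, which is where the (hoped-for) sparsity of SVM solutions enters.
  \hat{g} = \argmin_{g \in \mathcal{H}}
    \left\{ \frac{1}{n} \sum_{i=1}^{n} L\bigl(y_i, g(x_i)\bigr)
      + \lambda \, \lVert g \rVert_{\mathcal{H}}^{2} \right\},
  \qquad
  \hat{g}(x) = \sum_{i=1}^{n} \alpha_i \, k(x_i, x).
\end{gather*}

\end{document}

The ill-posedness mentioned in the abstract arises from dividing by the error characteristic function, which decays rapidly for smooth (e.g. Gaussian) errors; the Tikhonov penalty is what stabilizes the RKHS solution, and the sparsity question concerns how many of the coefficients alpha_i are nonzero.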
References
Aronszajn N (1950). Theory of reproducing kernels, Transactions of the American Mathematical Society, 68, 337-404.

Bochner S (1959). Lectures on Fourier Integrals, Princeton University Press, Princeton, New Jersey.

Carroll RJ and Hall P (1988). Optimal rates of convergence for deconvolving a density, Journal of the American Statistical Association, 83, 1184-1186.

Fan J (1991). On the optimal rates of convergence for nonparametric deconvolution problems, Annals of Statistics, 19, 1257-1272.

Fan J (1992). Deconvolution with supersmooth distributions, Canadian Journal of Statistics, 20, 159-169.

Girosi F (1998). An equivalence between sparse approximation and support vector machines, Neural Computation, 10, 1455-1480.

Gunn SR (1998). Support vector machines for classification and regression, Technical report, University of Southampton.

Hall P and Qiu P (2005). Discrete-transform approach to deconvolution problems, Biometrika, 92, 135-148.

Hazelton ML and Turlach BA (2009). Nonparametric density deconvolution by weighted kernel estimators, Statistics and Computing, 19, 217-228.

Lee S (2010). A support vector method for the deconvolution problem, Communications of the Korean Statistical Society, 17, 451-457.

Lee S (2012). A note on deconvolution estimators when measurement errors are normal, Communications of the Korean Statistical Society, 19, 517-526.

Lee S and Taylor RL (2008). A note on support vector density estimation for the deconvolution problem, Communications in Statistics: Theory and Methods, 37, 328-336.

Mendelsohn J and Rice J (1982). Deconvolution of microfluorometric histograms with B splines, Journal of the American Statistical Association, 77, 748-753.

Mercer J (1909). Functions of positive and negative type and their connection with the theory of integral equations, Philosophical Transactions of the Royal Society of London, A 209, 415-446.

Moguerza JM and Munoz A (2006). Support vector machines with applications, Statistical Science, 21, 322-336.

Mukherjee S and Vapnik V (1999). Support vector method for multivariate density estimation, In Advances in Neural Information Processing Systems, 659-665.

Pensky M and Vidakovic B (1999). Adaptive wavelet estimator for nonparametric density deconvolution, Annals of Statistics, 27, 2033-2053.

Phillips DL (1962). A technique for the numerical solution of integral equations of the first kind, Journal of the Association for Computing Machinery, 9, 84-97.

Rasmussen CE and Williams CKI (2006). Gaussian Processes for Machine Learning, MIT Press, Cambridge, MA.

Scholkopf B, Herbrich R, and Smola AJ (2001). A generalized representer theorem, Computational Learning Theory, Lecture Notes in Computer Science, 2111, 416-426.

Scholkopf B and Smola AJ (2002). Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA.

Smola AJ and Scholkopf B (2003). A tutorial on support vector regression, Statistics and Computing, 14, 199-222.

Stefanski L and Carroll RJ (1990). Deconvoluting kernel density estimators, Statistics, 21, 169-184.

Tikhonov AN and Arsenin VY (1977). Solutions of Ill-posed Problems, V. H. Winston, Washington.

Vapnik V (1995). The Nature of Statistical Learning Theory, Springer Verlag, New York.

Vapnik V and Chervonenkis A (1964). A note on one class of perceptrons, Automation and Remote Control, 25, 103-109.

Vapnik V and Lerner A (1963). Pattern recognition using generalized portrait method, Automation and Remote Control, 24, 774-780.

Vert R and Vert J (2006). Consistency and convergence rates of one-class SVMs and related algorithms, Journal of Machine Learning Research, 7, 817-854.

Wahba G (1990). Spline Models for Observational Data, CBMS-NSF Regional Conference Series in Applied Mathematics, 59, SIAM, Philadelphia.

Wahba G (2006). Comment on support vector machines with applications by J. M. Moguerza and A. Munoz, Statistical Science, 21, 347-351.

Wang X and Wang B (2011). Deconvolution estimation in measurement error models: The R package decon, Journal of Statistical Software, 39(10).

Weston J, Gammerman A, Stitson M, Vapnik V, Vovk V, and Watkins C (1999). Support vector density estimation, In Scholkopf B and Smola A (Eds.), Advances in Kernel Methods - Support Vector Learning, 293-306, MIT Press, Cambridge, MA.

Zhang HP (1992). On deconvolution using time of flight information in positron emission tomography, Statistica Sinica, 2, 553-575.