Multiclass Support Vector Machines with SCAD
Jung, Kang-Mo
Classification with high-dimensional predictors is an important research field in pattern recognition. The support vector machine (SVM) with a penalty term serves as both a feature selector and a classifier. The method considered here combines the hinge loss function with a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) suggested by Fan and Li (2001). We developed an algorithm for the multiclass SVM with the SCAD penalty function using the local quadratic approximation. For multiclass problems we compared the performance of the developed method with that of the SVM with the L1 and L2 penalty functions.
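The SCAD penalty and the local quadratic approximation (LQA) used in the abstract can be sketched as follows. This is a minimal illustration of the Fan and Li (2001) penalty, not code from the paper; the function names, the default tuning constant a = 3.7, and the numerical floor `eps` are my own choices.

```python
import numpy as np

def scad_penalty(t, lam=1.0, a=3.7):
    """SCAD penalty of Fan and Li (2001), evaluated at |beta| = t:
    lam*t on [0, lam]; a quadratic blend on (lam, a*lam]; constant beyond a*lam."""
    t = np.abs(np.asarray(t, dtype=float))
    return np.where(
        t <= lam,
        lam * t,
        np.where(
            t <= a * lam,
            -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1)),
            (a + 1) * lam**2 / 2,
        ),
    )

def scad_deriv(t, lam=1.0, a=3.7):
    """First derivative p'_lam(t) for t > 0; it vanishes for t > a*lam,
    which is what gives SCAD its unbiasedness for large coefficients."""
    t = np.abs(np.asarray(t, dtype=float))
    return lam * np.where(t <= lam, 1.0,
                          np.maximum(a * lam - t, 0.0) / ((a - 1) * lam))

def lqa_weight(beta0, lam=1.0, a=3.7, eps=1e-8):
    """Local quadratic approximation around the current iterate beta0:
    p(|b|) ~ p(|b0|) + 0.5 * w * (b^2 - b0^2) with w = p'(|b0|)/|b0|,
    so each iteration reduces to a weighted ridge-type problem."""
    b = np.maximum(np.abs(beta0), eps)  # floor avoids division by zero
    return scad_deriv(b, lam, a) / b
```

In an LQA iteration, coefficients shrunk very close to zero receive an extremely large weight and are effectively deleted, which is how the penalized SVM performs feature selection.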
Keywords: Local quadratic approximation; multiclass support vector machine; penalized; smoothly clipped absolute deviation
 Cited by
Weighted Support Vector Machines with the SCAD Penalty, Jung, Kang-Mo, Communications for Statistical Applications and Methods, 2013, 20, 6, 481-490.

Support Vector Machines for Unbalanced Multicategory Classification, Mathematical Problems in Engineering, 2015.
References

Bradley, P. S. and Mangasarian, O. L. (1998). Feature selection via concave minimization and support vector machines, In Proceedings of the 15th International Conference on Machine Learning, 82-90.

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression, Annals of Statistics, 32, 407-499.

Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360.

Hastie, T., Tibshirani, R. and Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer.

Jung, K.-M. (2008). Robust statistical methods in variable selection, Journal of the Korean Data Analysis Society, 10, 3057-3066.

Lee, Y., Lin, Y. and Wahba, G. (2004). Multicategory support vector machines, theory and applications to the classification of microarray data and satellite radiance data, Journal of the American Statistical Association, 99, 67-81.

Li, J., Jia, Y. and Li, W. (2011). Adaptive huberized support vector machine and its application to microarray classification, Neural Computing & Applications, 20, 123-132.

Liu, Y., Zhang, H. H., Park, C. and Ahn, J. (2007). Support vector machines with adaptive Lq penalty, Computational Statistics & Data Analysis, 51, 6380-6394.

Tibshirani, R. J. (1996). Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society, Series B, 58, 267-288.

Vapnik, V. (1995). The Nature of Statistical Learning Theory, Springer.

Weston, J. and Watkins, C. (1999). Support vector machines for multi-class pattern recognition, In Proceedings of the Seventh European Symposium on Artificial Neural Networks.

Zhang, H. H., Ahn, J., Lin, X. and Park, C. (2006). Gene selection using support vector machines with non-convex penalty, Bioinformatics, 22, 88-95.