Multiclass Support Vector Machines with SCAD

Title & Authors
Jung, Kang-Mo

Abstract
Classification is an important research field in pattern recognition with high-dimensional predictors. The support vector machine (SVM) can serve as a penalized feature selector and classifier. It is based on the hinge loss function combined with a non-convex penalty function, the smoothly clipped absolute deviation (SCAD) penalty suggested by Fan and Li (2001). We developed an algorithm for the multiclass SVM with the SCAD penalty using the local quadratic approximation. For multiclass problems we compared the performance of the developed method with that of the SVM with the $L_1$ and $L_2$ penalty functions.
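The abstract names the two ingredients of the method: the SCAD penalty of Fan and Li (2001) and its local quadratic approximation (LQA), which turns the non-convex penalized problem into a sequence of weighted ridge-type problems. The sketch below illustrates those two pieces only; it is not the paper's implementation, and the function names and the `eps` safeguard are our own choices (with the standard value a = 3.7 recommended by Fan and Li).

```python
def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001) evaluated at a coefficient beta.

    lam is the tuning parameter; a = 3.7 is the value recommended by
    Fan and Li. The penalty is linear near zero (like the L1 penalty),
    quadratic in the middle, and constant in the tail, so large
    coefficients are not shrunk further.
    """
    t = abs(beta)
    if t <= lam:
        return lam * t
    elif t <= a * lam:
        # quadratic middle piece that tapers the penalty off
        return -(t * t - 2 * a * lam * t + lam * lam) / (2 * (a - 1))
    else:
        # constant tail
        return (a + 1) * lam * lam / 2

def scad_derivative(theta, lam, a=3.7):
    """First derivative p'_lam(theta) of the SCAD penalty for theta >= 0."""
    if theta <= lam:
        return lam
    return max(a * lam - theta, 0.0) / (a - 1)

def lqa_weight(beta0, lam, a=3.7, eps=1e-8):
    """Local quadratic approximation weight at the current iterate beta0:

        p_lam(|beta|) ~ p_lam(|beta0|)
            + 0.5 * (p'_lam(|beta0|) / |beta0|) * (beta**2 - beta0**2),

    so each iteration solves a weighted ridge problem. Coefficients whose
    iterate approaches zero get an exploding weight and are effectively
    deleted from the model; eps guards the division (our choice).
    """
    t = max(abs(beta0), eps)
    return scad_derivative(t, lam, a) / t
```

With lam = 1 and a = 3.7, for example, `scad_penalty` returns `lam * |beta|` for `|beta| <= 1` and levels off at the constant `(a + 1) * lam**2 / 2 = 2.35` once `|beta| > 3.7`, which is what makes the penalty nearly unbiased for large coefficients.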
Keywords
Local quadratic approximation; multiclass support vector machine; penalized; smoothly clipped absolute deviation
Language
English
Cited by
1. Support Vector Machines for Unbalanced Multicategory Classification, Mathematical Problems in Engineering, 2015.
2. Weighted Support Vector Machines with the SCAD Penalty, Communications for Statistical Applications and Methods, 2013, 20(6), 481-490.
References
1. Bradley, P. S. and Mangasarian, O. L. (1998). Feature selection via concave minimization and support vector machines, In Proceedings of the Fifteenth International Conference on Machine Learning, 82-90.
2. Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression, Annals of Statistics, 32, 407-499.
3. Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360.
4. Hastie, T., Tibshirani, R. and Friedman, J. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Springer.
5. Jung, K.-M. (2008). Robust statistical methods in variable selection, Journal of the Korean Data Analysis Society, 10, 3057-3066.
6. Lee, Y., Lin, Y. and Wahba, G. (2004). Multicategory support vector machines, theory and applications to the classification of microarray data and satellite radiance data, Journal of the American Statistical Association, 99, 67-81.
7. Li, J., Jia, Y. and Li, W. (2011). Adaptive huberized support vector machine and its application to microarray classification, Neural Computing & Applications, 20, 123-132.
8. Liu, Y., Zhang, H. H., Park, C. and Ahn, J. (2007). Support vector machines with adaptive Lq penalty, Computational Statistics & Data Analysis, 51, 6380-6394.
9. Tibshirani, R. J. (1996). Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society, Series B, 58, 267-288.
10. Vapnik, V. (1995). The Nature of Statistical Learning Theory, Springer.
11. Weston, J. and Watkins, C. (1999). Support vector machines for multi-class pattern recognition, In Proceedings of the Seventh European Symposium on Artificial Neural Networks.
12. Zhang, H. H., Ahn, J., Lin, X. and Park, C. (2006). Gene selection using support vector machines with non-convex penalty, Bioinformatics, 22, 88-95.