Variable Selection with Nonconcave Penalty Function on Reduced-Rank Regression

  • Received : 2014.08.30
  • Accepted : 2014.12.21
  • Published : 2015.01.31

Abstract

In this article, we propose nonconcave penalties on a reduced-rank regression model to select variables and estimate coefficients simultaneously. We apply the HARD (hard thresholding) and SCAD (smoothly clipped absolute deviation) penalties, symmetric functions that are singular at the origin and bounded by a constant, which reduces estimation bias. In a simulation study and a real data analysis, the new method is compared with an existing variable selection method based on the $L_1$ penalty and exhibits competitive performance in prediction and variable selection. Instead of relying on a single type of penalty function, we use two or three penalty functions simultaneously, combining the advantages of the different types to select relevant predictors, estimate coefficients, and improve the overall performance of model fitting.
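For reference, the two penalties named above have standard closed forms; the display below restates them as given in Antoniadis (1997) and Fan and Li (2001), with tuning constants $\lambda$ and $a$ following the notation of those papers rather than this article. The HARD penalty is

$$p_\lambda(\theta) = \lambda^2 - (|\theta| - \lambda)^2 \, I(|\theta| < \lambda),$$

and the SCAD penalty is defined through its derivative, for $\theta > 0$,

$$p'_\lambda(\theta) = \lambda \left\{ I(\theta \le \lambda) + \frac{(a\lambda - \theta)_+}{(a-1)\lambda} \, I(\theta > \lambda) \right\}, \qquad a > 2,$$

with $a = 3.7$ suggested by Fan and Li (2001). Both functions are symmetric, singular at the origin (which yields sparse solutions) and constant for large $|\theta|$ (which bounds the penalty and reduces bias), matching the properties described in the abstract.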

References

  1. Antoniadis, A. (1997). Wavelets in statistics: A review (with discussion), Journal of the Italian Statistical Association, 6, 97-144. https://doi.org/10.1007/BF03178905
  2. Breiman, L. (1995). Better subset regression using the nonnegative garrote, Technometrics, 37, 373-384. https://doi.org/10.1080/00401706.1995.10484371
  3. Chen, L. and Huang, J. Z. (2012). Sparse reduced-rank regression for simultaneous dimension reduction and variable selection, Journal of the American Statistical Association, 107, 1533-1545. https://doi.org/10.1080/01621459.2012.734178
  4. Chun, H. and Keleş, S. (2010). Sparse partial least squares regression for simultaneous dimension reduction and variable selection, Journal of the Royal Statistical Society, Series B, 72, 3-25. https://doi.org/10.1111/j.1467-9868.2009.00723.x
  5. Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360. https://doi.org/10.1198/016214501753382273
  6. Friedman, J., Hastie, T., Höfling, H. and Tibshirani, R. (2007). Pathwise coordinate optimization, The Annals of Applied Statistics, 1, 302-332. https://doi.org/10.1214/07-AOAS131
  7. Gower, J. C. and Dijksterhuis, G. B. (2004). Procrustes Problems, Oxford University Press, New York.
  8. Izenman, A. J. (1975). Reduced-rank regression for the multivariate linear model, Journal of Multivariate Analysis, 5, 248-264. https://doi.org/10.1016/0047-259X(75)90042-1
  9. Reinsel, G. C. and Velu, R. P. (1998). Multivariate Reduced-Rank Regression: Theory and Applications, Springer, New York.
  10. Tibshirani, R. J. (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
  11. Yee, T. W. and Hastie, T. J. (2003). Reduced-rank vector generalized linear models, Statistical Modelling, 3, 15-41. https://doi.org/10.1191/1471082X03st045oa
  12. Yuan, M. and Lin, Y. (2006). Model selection and estimation in regression with grouped variables, Journal of the Royal Statistical Society, Series B, 68, 49-67. https://doi.org/10.1111/j.1467-9868.2005.00532.x

Cited by

  1. Model selection algorithm in Gaussian process regression for computer experiments vol.24, pp.4, 2017, https://doi.org/10.5351/CSAM.2017.24.4.383
  2. Penalized rank regression estimator with the smoothly clipped absolute deviation function vol.24, pp.6, 2017, https://doi.org/10.29220/CSAM.2017.24.6.673