Penalized rank regression estimator with the smoothly clipped absolute deviation function

  • Park, Jong-Tae (Department of Data Information, Pyeongtaek University)
  • Jung, Kang-Mo (Department of Statistics and Computer Science, Kunsan National University)
  • Received : 2017.09.05
  • Accepted : 2017.10.17
  • Published : 2017.11.30

Abstract

The least absolute shrinkage and selection operator (LASSO) is a popular regression estimator that performs simultaneous variable selection. However, LASSO does not have the oracle property, and a robust version is needed when the errors are heavy-tailed or the data contain serious outliers. We propose a robust penalized regression estimator that performs variable selection and estimation simultaneously. It is based on rank regression combined with a non-convex penalty, the smoothly clipped absolute deviation (SCAD) function, which possesses the oracle property. The proposed method thus inherits both the robustness of rank regression and the oracle property of the SCAD penalty. We develop an efficient algorithm that computes the SCAD estimate through a local linear approximation of the penalty, so that each step can be solved by the least absolute deviation method. The tuning parameter of the penalty is selected by the Bayesian information criterion and by cross-validation. Numerical simulations show that the proposed estimator is robust and effective for analyzing contaminated data.
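The two building blocks the abstract names are standard and can be sketched directly: the derivative of the SCAD penalty of Fan and Li (2001), with the usual choice a = 3.7, and Jaeckel's (1972) rank dispersion with Wilcoxon scores, which is proportional to the sum of pairwise absolute differences of residuals. The sketch below is ours, not code from the paper; the function names `scad_deriv` and `rank_dispersion` are illustrative.

```python
import numpy as np

def scad_deriv(theta, lam, a=3.7):
    """Derivative of the SCAD penalty (Fan and Li, 2001).

    Equals lam for |theta| <= lam, decays linearly to 0 on (lam, a*lam],
    and is exactly 0 beyond a*lam; this vanishing derivative for large
    coefficients is what underlies the oracle property.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    return lam * ((t <= lam)
                  + np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam) * (t > lam))

def rank_dispersion(resid):
    """Jaeckel's dispersion with Wilcoxon scores, in its pairwise form:
    proportional to sum over i < j of |e_i - e_j| (here scaled by 1/(2n))."""
    e = np.asarray(resid, dtype=float)
    n = len(e)
    return np.abs(e[:, None] - e[None, :]).sum() / (2.0 * n)
```

In a local linear approximation step, the SCAD penalty is replaced by the weighted L1 penalty sum_j scad_deriv(|beta_j_current|, lam) * |beta_j|, so each iteration is a weighted L1-penalized rank regression; since the Wilcoxon rank loss is a sum of absolute pairwise residual differences, the subproblem can be recast and solved as a least absolute deviation fit, as the abstract states.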

Acknowledgement

Supported by : National Research Foundation of Korea (NRF)

References

  1. Alfons A, Croux C, and Gelper S (2013). Sparse least trimmed squares regression for analyzing high-dimensional large data sets, The Annals of Applied Statistics, 7, 226-248. https://doi.org/10.1214/12-AOAS575
  2. Chen X, Wang J, and McKeown MJ (2010). Asymptotic analysis of robust LASSOs in the presence of noise with large variance, IEEE Transactions on Information Theory, 56, 5131-5149. https://doi.org/10.1109/TIT.2010.2059770
  3. Fan J and Li R (2001). Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, 96, 1348-1360. https://doi.org/10.1198/016214501753382273
  4. Hoerl AE and Kennard RW (1970). Ridge regression: biased estimation for nonorthogonal problems, Technometrics, 12, 55-67. https://doi.org/10.1080/00401706.1970.10488634
  5. Jaeckel LA (1972). Estimating regression coefficients by minimizing the dispersion of the residuals, The Annals of Mathematical Statistics, 43, 1449-1458. https://doi.org/10.1214/aoms/1177692377
  6. Jung KM (2011). Weighted least absolute deviation LASSO estimator, Communications for Statistical Applications and Methods, 18, 733-739. https://doi.org/10.5351/CKSS.2011.18.6.733
  7. Jung KM (2012). Weighted least absolute deviation regression estimator with the SCAD function, Journal of the Korean Data Analysis Society, 14, 2305-2312.
  8. Jung KM (2013). Weighted support vector machines with the SCAD penalty, Communications for Statistical Applications and Methods, 20, 481-490. https://doi.org/10.5351/CSAM.2013.20.6.481
  9. Jung SY and Park C (2015). Variable selection with nonconcave penalty function on reduced-rank regression, Communications for Statistical Applications and Methods, 22, 41-54. https://doi.org/10.5351/CSAM.2015.22.1.041
  10. Kim HJ, Ollila E, and Koivunen V (2015). New robust LASSO method based on ranks. In Proceedings of the 23rd European Signal Processing Conference, Nice, France, 704-708.
  11. Lee S (2015). An additive sparse penalty for variable selection in high-dimensional linear regression model, Communications for Statistical Applications and Methods, 22, 147-157. https://doi.org/10.5351/CSAM.2015.22.2.147
  12. Leng C, Lin Y, and Wahba G (2006). A note on the LASSO and related procedures in model selection, Statistica Sinica, 16, 1273-1284.
  13. Rousseeuw PJ and Leroy AM (1987). Robust Regression and Outlier Detection, John Wiley, New York.
  14. Tibshirani R (1996). Regression shrinkage and selection via the LASSO, Journal of the Royal Statistical Society Series B (Methodological), 58, 267-288. https://doi.org/10.1111/j.2517-6161.1996.tb02080.x
  15. Wang H, Li G, and Jiang G (2007). Robust regression shrinkage and consistent variable selection through the LAD-Lasso, Journal of Business & Economic Statistics, 25, 347-355. https://doi.org/10.1198/073500106000000251
  16. Zou H and Li R (2008). One-step sparse estimates in nonconcave penalized likelihood models, Annals of Statistics, 36, 1509-1533. https://doi.org/10.1214/009053607000000802