A study on bias effect of LASSO regression for model selection criteria

Yu, Donghyeon (유동현)

  • Received : 2016.03.02
  • Accepted : 2016.04.28
  • Published : 2016.06.30

Abstract

High dimensional data, in which the number of variables exceeds the number of samples, are frequently encountered in various fields. In such settings, variable selection is usually necessary to estimate the regression coefficients and to avoid overfitting. A penalized regression model carries out variable selection and coefficient estimation simultaneously, which makes it widely used for high dimensional data. However, a penalized regression model still requires selecting the optimal model by choosing a tuning parameter based on a model selection criterion. This study deals with the effect of the bias of the LASSO regression estimator on model selection criteria. We numerically describe the bias effect on the model selection criteria and apply the proposed correction to the identification of biomarkers for lung cancer based on gene expression data.
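For context, the sketch below shows one standard way the pipeline described above can be set up: fitting the LASSO over a grid of tuning parameters, scoring each fit with a BIC-type criterion, and optionally recomputing the residuals from an unpenalized least-squares refit on the selected variables so that the shrinkage bias of the LASSO coefficients does not enter the criterion. This is a minimal illustration in Python with scikit-learn, not the paper's code; the refitting step is only an assumption about the kind of correction the abstract refers to, and the helper names `bic_from_rss` and `select_lambda_by_bic` are hypothetical.

```python
# Minimal sketch (not the paper's code): choose the LASSO tuning parameter by a
# BIC-type criterion, with or without a bias correction that recomputes residuals
# from an unpenalized OLS refit on the selected variables.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

def bic_from_rss(rss, n, df):
    """Gaussian BIC up to additive constants: n*log(RSS/n) + log(n)*df."""
    return n * np.log(rss / n) + np.log(n) * df

def select_lambda_by_bic(X, y, lambdas, correct_bias=True):
    """Return (best BIC value, chosen lambda, selected variable indices)."""
    n = X.shape[0]
    best = (np.inf, None, None)
    for lam in lambdas:
        fit = Lasso(alpha=lam, max_iter=10000).fit(X, y)
        support = np.flatnonzero(fit.coef_)   # indices of nonzero coefficients
        df = support.size                     # degrees of freedom of the LASSO fit
        if correct_bias and df > 0:
            # Correction: residuals from an OLS refit on the selected support, so
            # the shrinkage bias of the LASSO estimates does not inflate the RSS.
            refit = LinearRegression().fit(X[:, support], y)
            resid = y - refit.predict(X[:, support])
        else:
            resid = y - fit.predict(X)        # residuals of the shrunken (biased) fit
        crit = bic_from_rss(np.sum(resid**2), n, df)
        if crit < best[0]:
            best = (crit, lam, support)
    return best
```

Running both settings over the same grid of `lambdas` gives a small-scale version of the comparison the abstract describes numerically: the criterion computed from the shrunken fit and the one computed after refitting can disagree on the chosen tuning parameter, and hence on the selected variables.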

Keywords

LASSO; penalized regression; bias; model selection; information criterion

Acknowledgement

Supported by: Keimyung University (계명대학교)