Adaptive lasso in sparse vector autoregressive models

Variable selection in sparse vector autoregressive models using the adaptive lasso

  • Received : 2015.10.27
  • Accepted : 2015.11.30
  • Published : 2016.02.29

Abstract

This paper considers variable selection in the sparse vector autoregressive (sVAR) model, where sparsity comes from setting small coefficients to exact zeros. From the estimation perspective, Davis et al. (2015) showed that lasso-type regularization is successful because it provides simultaneous variable selection and parameter estimation, even for time series data. However, their simulation study reports that the regular lasso overestimates the number of non-zero coefficients, so its finite-sample performance needs improvement. In this article, we show that the adaptive lasso significantly improves on this performance: it recovers the sparsity pattern better than the regular lasso does. Tuning parameter selection for the adaptive lasso is also discussed based on the simulation study.
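The two-stage procedure described above can be sketched in code. The following is a minimal illustration (not the authors' implementation): an ordinary lasso fit on the lagged design supplies initial estimates, the adaptive weights are set to w_j = 1/(|b_j| + eps)^gamma, and the weighted lasso is solved by rescaling each predictor column by 1/w_j. The VAR(1) data-generating process, the penalty level `lam`, and `gamma = 1` are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: adaptive lasso for a sparse VAR(1), y_t = A y_{t-1} + e_t.
# Stage 1 (plain lasso) gives initial estimates; stage 2 solves a weighted lasso
# per equation by rescaling columns, which is algebraically equivalent to
# penalizing coefficient j by w_j * |b_j|.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
K, T = 5, 400
A = np.zeros((K, K))
A[0, 0], A[1, 3], A[2, 2], A[4, 1] = 0.6, 0.5, -0.4, 0.3  # sparse transition matrix
y = np.zeros((T, K))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=K)

X, Y = y[:-1], y[1:]  # lagged design and response

def adaptive_lasso(X, Y, lam=0.05, gamma=1.0, eps=1e-4):
    # Stage 1: ordinary lasso supplies the initial estimates.
    init = Lasso(alpha=lam, fit_intercept=False).fit(X, Y)
    A_hat = np.zeros((Y.shape[1], X.shape[1]))
    for k in range(Y.shape[1]):  # one weighted lasso per VAR equation
        w = 1.0 / (np.abs(init.coef_[k]) + eps) ** gamma
        Xw = X / w               # column rescaling implements the weights
        fit = Lasso(alpha=lam, fit_intercept=False).fit(Xw, Y[:, k])
        A_hat[k] = fit.coef_ / w  # map coefficients back to the original scale
    return A_hat

A_hat = adaptive_lasso(X, Y)
print("true nonzeros:", int((A != 0).sum()),
      "estimated nonzeros:", int((np.abs(A_hat) > 1e-3).sum()))
```

Coefficients with small initial estimates receive large weights and are pushed to exact zero, which is how the adaptive lasso corrects the regular lasso's tendency to over-select.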

Keywords

sparse vector autoregressive model; adaptive lasso; high-dimensional time series

References

  1. Arnold, A., Liu, Y., and Abe, N. (2008). Temporal causal modeling with graphical Granger methods, In Proceedings of the 13th ACM SIGKDD International Conference of Knowledge Discovery and Data Mining.
  2. Cule, E. and De Iorio, M. (2013). Ridge regression in prediction problems: automatic choice of the ridge parameter, Genetic Epidemiology, 37, 704-714. https://doi.org/10.1002/gepi.21750
  3. Dahlhaus, R., Eichler, M., and Sandkühler, J. (1997). Identification of synaptic connections in neural ensembles by graphical models, Journal of Neuroscience Methods, 77, 93-107. https://doi.org/10.1016/S0165-0270(97)00100-3
  4. Davis, R. A., Zang, P., and Zheng, T. (2015). Sparse vector autoregressive modeling, arXiv:1207.0520.
  5. Hastie, T., Tibshirani, R., and Wainwright, M. (2015). Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press.
  6. Huang, J., Ma, S., and Zhang, C.-H. (2008). Adaptive lasso for sparse high-dimensional regression models, Statistica Sinica, 18, 1603-1618.
  7. Hsu, N.-J., Hung, H.-L., and Chang, Y.-M. (2008). Subset selection for vector autoregressive processes using lasso, Computational Statistics & Data Analysis, 52, 3645-3657. https://doi.org/10.1016/j.csda.2007.12.004
  8. Lozano, A. C., Abe, N., Liu, Y., and Rosset, S. (2009). Grouped graphical Granger modeling for gene expression regulatory networks discovery, Bioinformatics, 25, 110-118. https://doi.org/10.1093/bioinformatics/btp199
  9. Lütkepohl, H. (2005). New Introduction to Multiple Time Series Analysis, Springer-Verlag, Berlin.
  10. Song, S. and Bickel, P. J. (2011). Large vector auto regressions, arXiv:1106.3915.
  11. Sims, C. A. (1980). Macroeconomics and reality, Econometrica, 48, 1-48.
  12. Tibshirani, R. (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.
  13. Zhang, J., Jeng, X. J., and Liu, H. (2008). Some two-step procedures for variable selection in high-dimensional linear regression, arXiv:0810.1644.
  14. Zou, H. (2006). The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429. https://doi.org/10.1198/016214506000000735

Acknowledgement

Supported by: National Research Foundation of Korea (한국연구재단)