Adaptive lasso in sparse vector autoregressive models
Lee, Sl Gi; Baek, Changryong
This paper considers variable selection in the sparse vector autoregressive (sVAR) model, where sparsity comes from setting small coefficients to exact zeros. From the estimation perspective, Davis et al. (2015) showed that lasso-type regularization is successful because it provides simultaneous variable selection and parameter estimation even for time series data. However, their simulation study reports that the regular lasso overestimates the number of non-zero coefficients, so its finite-sample performance needs improvement. In this article, we show that the adaptive lasso significantly improves performance by recovering sparsity patterns better than the regular lasso. Tuning parameter selection for the adaptive lasso is also discussed based on the simulation study.
Keywords: sparse vector autoregressive model; adaptive lasso; high dimensional time series
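
To make the contrast between the regular lasso and the adaptive lasso concrete, the sketch below estimates each equation of a simulated sparse VAR(1) by both methods. It is a minimal illustration in Python, not the authors' code: it assumes numpy and scikit-learn are available, uses a ridge pilot estimate and gamma = 1 for the adaptive weights as illustrative choices, and uses ordinary cross-validation for the penalty level (time-series-aware validation would be preferable in practice).

import numpy as np
from sklearn.linear_model import LassoCV, Ridge

def var_design(Y, p):
    # Row t of X holds [y_{t-1}, ..., y_{t-p}]; the target is y_t.
    T, k = Y.shape
    X = np.hstack([Y[p - l:T - l] for l in range(1, p + 1)])
    return X, Y[p:]

def adaptive_lasso(X, y, gamma=1.0, eps=1e-6):
    # Adaptive lasso: penalize coefficient j by 1/|pilot_j|^gamma, implemented
    # by rescaling the columns of X, fitting a plain lasso, and mapping back.
    pilot = Ridge(alpha=1.0).fit(X, y).coef_
    w = np.abs(pilot) ** gamma + eps
    fit = LassoCV(cv=5).fit(X * w, y)
    return fit.coef_ * w

rng = np.random.default_rng(0)
k, T = 5, 300
A = np.diag(np.full(k, 0.5))                 # sparse (diagonal) VAR(1) truth
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A.T + rng.normal(scale=0.5, size=k)

X, Z = var_design(Y, p=1)
for i in range(k):                           # estimate equation by equation
    plain = LassoCV(cv=5).fit(X, Z[:, i]).coef_
    adapt = adaptive_lasso(X, Z[:, i])
    print(i, "lasso nonzeros:", int(np.sum(plain != 0)),
          "adaptive lasso nonzeros:", int(np.sum(adapt != 0)))

The column rescaling implements the weights w_j = 1/|pilot_j|^gamma of Zou (2006): components with larger pilot estimates are penalized less, which is how the adaptive lasso curbs the overselection of non-zero coefficients reported for the regular lasso in the abstract.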
References
Arnold, A., Liu, Y., and Abe, N. (2008). Temporal causal modeling with graphical Granger methods, In Proceedings of the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.

Cule, E. and De Iorio, M. (2013). Ridge regression in prediction problems: automatic choice of the ridge parameter, Genetic Epidemiology, 37, 704-714.

Dahlhaus, R., Eichler, M., and Sandkühler, J. (1997). Identification of synaptic connections in neural ensembles by graphical models, Journal of Neuroscience Methods, 77, 93-107.

Davis, R. A., Zang, P., and Zheng, T. (2015). Sparse vector autoregressive modeling, arXiv:1207.0520.

Granger, C. W. J. (1969). Investigating causal relations by econometric models and cross-spectral methods, Econometrica, 37, 424-438.

Hastie, T., Tibshirani, R., and Wainwright, M. (2015). Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press.

Hsu, N.-J., Hung, H.-L., and Chang, Y.-M. (2008). Subset selection for vector autoregressive processes using lasso, Computational Statistics & Data Analysis, 52, 3645-3657.

Huang, J., Ma, S., and Zhang, C.-H. (2008). Adaptive lasso for sparse high-dimensional regression models, Statistica Sinica, 18, 1608-1618.

Lozano, A. C., Abe, N., Liu, Y., and Rosset, S. (2009). Grouped graphical Granger modeling for gene expression regulatory networks discovery, Bioinformatics, 25, 110-118.

Lütkepohl, H. (2005). New Introduction to Multiple Time Series Analysis, Springer-Verlag, Berlin.

Sims, C. A. (1980). Macroeconomics and reality, Econometrica, 48, 1-48.

Song, S. and Bickel, P. J. (2011). Large vector auto regressions, arXiv:1106.3915.

Tibshirani, R. (1996). Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, 58, 267-288.

Zhang, J., Jeng, X. J., and Liu, H. (2008). Some two-step procedures for variable selection in high-dimensional linear regression, arXiv:0810.1644.

Zou, H. (2006). The adaptive lasso and its oracle properties, Journal of the American Statistical Association, 101, 1418-1429.