Robust varying coefficient model using L1 regularization

  • Hwang, Changha (Department of Applied Statistics, Dankook University) ;
  • Bae, Jongsik (Department of Mathematics, Sungkyunkwan University) ;
  • Shim, Jooyong (Institute of Statistical Information, Department of Statistics, Inje University)
  • Received : 2016.05.24
  • Accepted : 2016.06.23
  • Published : 2016.07.31

Abstract

In this paper we propose a robust version of the varying coefficient model, based on regularized regression with an L1 penalty. We use an iteratively reweighted least squares procedure to solve the L1 regularized objective function of the varying coefficient model in its locally weighted regression form. This provides efficient computation of the coefficient function estimates and variable selection for a given value of the smoothing variable. We present a generalized cross validation function and an Akaike information type criterion for model selection. Applications of the proposed model are illustrated with artificial examples and a real example of predicting the effect of the input variables and the smoothing variable on the output.
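
As a rough illustration of the estimation step described in the abstract, the Python sketch below applies an iteratively reweighted least squares (IRLS) update to an L1-penalized locally weighted regression at a single value of the smoothing variable. This is a minimal sketch under our own assumptions (a Gaussian kernel, fixed `bandwidth` and penalty `lam`, and the common IRLS surrogate |beta_j| ~ beta_j^2 / (|beta_j| + eps)); it is not the authors' implementation, and in practice the bandwidth and penalty would be chosen by the GCV or AIC-type criteria mentioned above.

```python
# Minimal sketch (not the paper's exact algorithm): IRLS for an L1-penalized
# varying coefficient model estimated by locally weighted regression.
import numpy as np

def kernel(u):
    """Gaussian smoothing kernel (an illustrative choice)."""
    return np.exp(-0.5 * u ** 2)

def local_l1_irls(y, X, t, t0, bandwidth, lam, n_iter=50, eps=1e-6):
    """Estimate beta(t0) in y_i = x_i' beta(t_i) + e_i at the point t0.

    The L1 penalty lam * sum_j |beta_j| is handled via the IRLS surrogate
    |beta_j| ~ beta_j^2 / (|beta_j_old| + eps), so each iteration reduces
    to a weighted ridge-type linear system.
    """
    w = kernel((t - t0) / bandwidth)        # local weights in the smoothing variable
    sw = np.sqrt(w)
    # weighted least squares fit as a starting value
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    W = np.diag(w)
    for _ in range(n_iter):
        D = np.diag(1.0 / (np.abs(beta) + eps))   # reweighting of the L1 penalty
        beta_new = np.linalg.solve(X.T @ W @ X + lam * D, X.T @ W @ y)
        if np.max(np.abs(beta_new - beta)) < 1e-8:
            beta = beta_new
            break
        beta = beta_new
    # treat numerically negligible coefficients as excluded variables
    return np.where(np.abs(beta) < 1e-6, 0.0, beta)

# Illustrative use: coefficients varying smoothly in t, third input irrelevant.
rng = np.random.default_rng(0)
n, p = 200, 3
t = rng.uniform(0.0, 1.0, n)
X = rng.normal(size=(n, p))
beta_true = np.column_stack([np.sin(2 * np.pi * t), 1.0 - t, np.zeros(n)])
y = np.sum(X * beta_true, axis=1) + 0.2 * rng.normal(size=n)
grid = np.linspace(0.1, 0.9, 5)
est = np.array([local_l1_irls(y, X, t, t0, bandwidth=0.1, lam=1.0) for t0 in grid])
print(np.round(est, 2))
```

Repeating the fit over a grid of smoothing-variable values, as in the last lines, traces out the estimated coefficient functions, with the L1 penalty shrinking the coefficient of the irrelevant third input toward zero.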

Keywords

Cited by

  1. Geographically weighted least squares-support vector machine vol.28, pp.1, 2016, https://doi.org/10.7465/jkdi.2017.28.1.227
  2. Feature selection in the semivarying coefficient LS-SVR vol.28, pp.2, 2016, https://doi.org/10.7465/jkdi.2017.28.2.461
  3. A study on a composite support vector quantile regression with varying coefficient model vol.29, pp.4, 2016, https://doi.org/10.7465/jkdi.2018.29.4.1077