 Title & Authors
Variable Selection in Sliced Inverse Regression Using Generalized Eigenvalue Problem with Penalties
Park, Chong-Sun
 Abstract
A variable selection algorithm for sliced inverse regression (SIR) based on penalty functions is proposed. We note that SIR models can be expressed as generalized eigenvalue decompositions and incorporate penalty functions into them. A small simulation study suggests that the HARD penalty function preserves the original directions better than other well-known penalty functions, and that it is effective in forcing the coefficient estimates of irrelevant predictors to zero. Results from illustrative examples with simulated and real data sets are provided.
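The abstract only sketches the method, so the following Python sketch is an illustration of the unpenalized starting point rather than the paper's algorithm: it estimates the classic SIR directions of Li (1991) by solving the generalized eigenvalue problem M v = lambda * Sigma v, where M is the weighted covariance of the slice means and Sigma the covariance of the predictors, and then hard-thresholds small loadings to mimic, very loosely, the effect of a HARD-type penalty. The function name sir_directions, the threshold argument, and the toy data are all hypothetical; the paper's penalized criterion and its simulated annealing optimization are not reproduced here.

    import numpy as np
    from scipy.linalg import eigh

    def sir_directions(X, y, n_slices=10, n_dir=2, threshold=None):
        # Classic (unpenalized) SIR: solve M v = lambda * Sigma v, where
        # M = Cov(E[X | y]) is estimated from slice means and Sigma = Cov(X).
        # `threshold` crudely zeroes small loadings; it stands in for, but is
        # not, the paper's HARD-penalized criterion.
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        Sigma = np.cov(Xc, rowvar=False)

        # Sort observations by the response and split them into slices.
        slice_idx = np.array_split(np.argsort(y), n_slices)
        M = np.zeros((p, p))
        for idx in slice_idx:
            m = Xc[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)

        # Generalized eigenvalue problem; eigh returns eigenvalues in
        # ascending order, so take the largest n_dir.
        eigvals, eigvecs = eigh(M, Sigma)
        top = np.argsort(eigvals)[::-1][:n_dir]
        B = eigvecs[:, top]

        if threshold is not None:
            B = np.where(np.abs(B) < threshold, 0.0, B)
        return eigvals[top], B

    # Toy example in the spirit of Li (1991): y depends on X only through
    # its first two coordinates, so the last four loadings should be near zero.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))
    y = X[:, 0] / (0.5 + (X[:, 1] + 1.5) ** 2) + 0.1 * rng.normal(size=500)
    vals, B = sir_directions(X, y, n_slices=10, n_dir=2, threshold=0.1)
    print(np.round(B, 2))

A penalized version would presumably add a penalty term to this eigenvalue criterion, which is the kind of non-smooth problem that motivates the simulated annealing listed among the keywords; the exact formulation is given in the paper itself.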
 Keywords
Sliced inverse regression; variable selection; penalty functions; simulated annealing
 Language
English
 Cited by
1.
Variable selection in the analysis of covariance model (공분산분석 모형에서의 변수선택 정리). Yoon, Sang-Hoo; Park, Jeong-Soo.

Communications for Statistical Applications and Methods, 2008, vol. 15, no. 3, pp. 333-342
 References
1.
Aarts, E. and Korst, J. (1989). Simulated Annealing and Boltzmann Machines. John Wiley & Sons

2.
Breiman, L. (1995). Better subset regression using the nonnegative garrote. Technometrics, 37, 373-384

3.
Cadima, J. and Jolliffe, I. T. (1995). Loadings and correlations in the interpretation of principal components. Journal of Applied Statistics, 22, 203-214

4.
Fan, J. and Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96, 1348-1360

5.
Hausman, R. E. Jr. (1982). Constrained multivariate analysis. Optimisation in Statistics (Zanakis, S. H. and Rustagi, J. S., eds.), 137-151, North-Holland: Amsterdam

6.
Jolliffe, I. T. (1972). Discarding variables in a principal component analysis. I: Artificial data. Applied Statistics, 21, 160-173

7.
Jolliffe, I. T. (1973). Discarding variables in a principal component analysis. II: Real data. Applied Statistics, 22, 21-31

8.
Jolliffe, I. T. (1989). Rotation of ill-defined principal components. Applied Statistics, 38, 139-147

9.
Jolliffe, I. T. (1995). Rotation of principal components: choice of normalization constraints. Journal of Applied Statistics, 22, 29-35

10.
Jolliffe, I. T., Trendafilov, N. T. and Uddin, M. (2003). A modified principal component technique based on the Lasso. Journal of Computational and Graphical Statistics, 12, 531-547

11.
Kirkpatrick, S., Gelatt, C. D. Jr. and Vecchi, M. P. (1983). Optimization by simulated annealing. Science, 220, 671-680

12.
Li, K. C. (1991). Sliced inverse regression for dimension reduction. Journal of the American Statistical Association, 86, 316-342

13.
Li, K. C. (2000). High dimensional data analysis via the SIR/PHD approach. Unpublished manuscript

14.
McCabe, G. P. (1984). Principal variables. Technometrics, 26, 137-144

15.
Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, Ser. B, 58, 267-288

16.
Vines, S. K. (2000). Simple principal components. Applied Statistics, 49, 441-451