Tutorial: Dimension reduction in regression with a notion of sufficiency
Yoo, Jae Keun
In this paper, we discuss dimension reduction of the predictors in a regression of Y|X under a notion of sufficiency called sufficient dimension reduction. In sufficient dimension reduction, the original predictors X are replaced by their lower-dimensional linear projections without loss of information on selected aspects of the conditional distribution of Y|X. Depending on the aspects, the central subspace, the central mean subspace and the central kth-moment subspace are defined and investigated as primary interests. The relationships among the three subspaces and the changes in the three subspaces under non-singular transformations of X are then studied. We discuss two conditions that guarantee the existence of the three subspaces; these constrain the marginal distribution of X and the conditional distribution of Y|X. A general approach to estimating the subspaces is also introduced, along with an explanation of the conditions commonly assumed in most sufficient dimension reduction methodologies.
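As a concrete illustration of the general estimation approach the abstract refers to, the sketch below implements sliced inverse regression (Li, 1991), one of the standard inverse-regression estimators of a basis for the central subspace. The function name `sir_directions`, the slice count, and the toy model are illustrative choices, not taken from the paper; this is a minimal sketch under the usual assumption of a continuous response.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Minimal sliced inverse regression (Li, 1991) sketch:
    estimate d basis directions of the central subspace."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice the ordered responses and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)  # kernel matrix for cov(E[Z|Y])
    # Leading eigenvectors of M, mapped back to the original X scale
    _, v = np.linalg.eigh(M)
    eta = v[:, ::-1][:, :d]
    beta = Sigma_inv_sqrt @ eta
    return beta / np.linalg.norm(beta, axis=0)

# Toy example: y depends on X only through the direction (1, 1, 0, 0, 0)
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
y = np.exp(0.5 * (X[:, 0] + X[:, 1])) + 0.1 * rng.standard_normal(2000)
beta_hat = sir_directions(X, y, n_slices=10, d=1)
# beta_hat[:, 0] should approximately align with (1, 1, 0, 0, 0)/sqrt(2)
```

The estimated direction is only identified up to sign and within the span of the central subspace, which is why the toy check compares alignment with the true direction rather than the coefficient vector itself.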
central subspace; central kth-moment subspace; central mean subspace; dimension reduction subspace; regression; sufficient dimension reduction
 Cited by
Dimension reduction for right-censored survival regression: transformation approach, Communications for Statistical Applications and Methods, 2016, 23, 3, 259
Tutorial: Methodologies for sufficient dimension reduction in regression, Communications for Statistical Applications and Methods, 2016, 23, 2, 105