Sufficient dimension reduction methods often require stringent conditions on the joint distribution of the predictor, or, when such conditions are not satisfied, rely on marginal transformation or reweighting to fulfill them approximately. For example, a typical dimension reduction method would require the predictor to have an elliptical or even multivariate normal distribution. In this paper, we reformulate the commonly used dimension reduction methods, via the notion of "central solution space," so as to circumvent such strong assumptions while at the same time preserving the desirable properties of the classical methods, such as √n-consistency and asymptotic normality. Imposing elliptical distributions or even stronger assumptions on predictors is often considered the necessary tradeoff for overcoming the "curse of dimensionality," but the development of this paper shows that this need not be the case. The new methods are compared with existing methods by simulation and applied to a data set.
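For context, a brief recap of the standard sufficient dimension reduction setup referred to above (this background, and the symbols β, d, and p used in it, are standard in the literature rather than stated in the abstract itself): the goal is to find a low-dimensional projection of the predictor that carries all the regression information, and classical estimators such as sliced inverse regression rely on a linearity condition on the predictor that is guaranteed when the predictor is elliptically distributed — the assumption the central solution space formulation is designed to relax.

```latex
% Standard sufficient dimension reduction setup (background sketch,
% not quoted from the paper): Y is the response, X \in \mathbb{R}^p
% the predictor, and \beta \in \mathbb{R}^{p \times d} with d \ll p.
\[
  Y \perp\!\!\!\perp X \mid \beta^{\mathsf T} X .
\]
% The central subspace is the intersection of all subspaces
% \operatorname{span}(\beta) satisfying this conditional independence.
% Classical estimators (e.g., sliced inverse regression) additionally
% require the linearity condition
\[
  E\bigl(X \mid \beta^{\mathsf T} X\bigr)
  \ \text{is a linear function of } \beta^{\mathsf T} X ,
\]
% which holds whenever X has an elliptically contoured distribution.
```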
Li, B., & Dong, Y. (2009). Dimension reduction for nonelliptically distributed predictors. Annals of Statistics, 37(3), 1272–1298. https://doi.org/10.1214/08-AOS598