SLICED INVERSE REGRESSION FOR DIMENSION REDUCTION

Cited by: 1484
Author
LI, KC
Institution
Keywords
DYNAMIC GRAPHICS; PRINCIPAL COMPONENT ANALYSIS; PROJECTION PURSUIT;
DOI
10.2307/2290563
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
Modern advances in computing power have greatly widened scientists' scope in gathering and investigating information from many variables, information which might have been ignored in the past. Yet to effectively scan a large pool of variables is not an easy task, although our ability to interact with data has been much enhanced by recent innovations in dynamic graphics. In this article, we propose a novel data-analytic tool, sliced inverse regression (SIR), for reducing the dimension of the input variable x without going through any parametric or nonparametric model-fitting process. This method explores the simplicity of the inverse view of regression; that is, instead of regressing the univariate output variable y against the multivariate x, we regress x against y. Forward regression and inverse regression are connected by a theorem that motivates this method. The theoretical properties of SIR are investigated under a model of the form y = f(β₁x, ..., β_K x, ε), where the β_k's are unknown row vectors. This model looks like a nonlinear regression, except for the crucial difference that the functional form of f is completely unknown. To reduce the dimension effectively, we need only estimate the space [the effective dimension reduction (e.d.r.) space] generated by the β_k's. This makes our goal different from the usual one in regression analysis, the estimation of all the regression coefficients. In fact, the β_k's themselves are not identifiable without a specific structural form on f. Our main theorem shows that under a suitable condition, if the distribution of x has been standardized to have zero mean and the identity covariance, the inverse regression curve, E(x | y), will fall into the e.d.r. space. Hence a principal component analysis on the covariance matrix of the estimated inverse regression curve can be conducted to locate its main orientation, yielding our estimates of the e.d.r. directions. Furthermore, we use a simple step function to estimate the inverse regression curve; no complicated smoothing is needed. SIR can be easily implemented on personal computers. By simulation, we demonstrate how SIR can effectively reduce the dimension of the input variable from, say, 10 to K = 2 for a data set with 400 observations. The spin-plot of y against the two projected variables obtained by SIR is found to mimic the spin-plot of y against the true directions very well. A chi-squared statistic is proposed to address the issue of whether a direction found by SIR is spurious.
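The slicing recipe described in the abstract (standardize x, estimate the inverse regression curve E(x | y) by within-slice means of x, then run a principal component analysis on the covariance matrix of those means) is short enough to sketch directly. The Python sketch below is only an illustration of that recipe under the stated model, not code from the article; the function name sir_directions, the choice of 10 slices, and the toy response in the usage example are assumptions made for the demonstration.

```python
# Minimal sketch of sliced inverse regression (SIR), following the steps
# in the abstract. Illustrative only; names and parameters are assumptions.
import numpy as np

def sir_directions(X, y, n_slices=10, K=2):
    """Estimate K e.d.r. directions by sliced inverse regression."""
    n, p = X.shape
    # 1. Standardize x to zero mean and identity covariance
    #    (assumes a nonsingular sample covariance matrix).
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    cov_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ cov_inv_sqrt
    # 2. Step-function estimate of the inverse regression curve E(x | y):
    #    sort on y and average the standardized x within each slice.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # 3. Weighted covariance matrix of the slice means.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # 4. Principal component analysis of M; the top K eigenvectors, mapped
    #    back to the original x-scale, estimate the e.d.r. directions.
    vals, vecs = np.linalg.eigh(M)
    top = vecs[:, ::-1][:, :K]
    betas = (cov_inv_sqrt @ top).T   # rows span the estimated e.d.r. space
    return betas, vals[::-1]

# Toy usage at the scale mentioned in the abstract (p = 10, n = 400, K = 2);
# the response function here is an arbitrary two-direction example.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 10))
y = X[:, 0] / (0.5 + (X[:, 1] + 1.5) ** 2) + 0.1 * rng.standard_normal(400)
betas, eigenvalues = sir_directions(X, y, n_slices=10, K=2)
print(betas.round(2))
```

The rows of betas estimate a basis of the e.d.r. space only up to a change of basis; a natural check, as in the abstract, is to compare the spin-plot of y against the two projected variables with the spin-plot against the true directions.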
Pages: 316 - 327
Number of pages: 12
Related Papers
50 records in total
  • [1] Overlapping sliced inverse regression for dimension reduction
    Zhang, Ning
    Yu, Zhou
    Wu, Qiang
    ANALYSIS AND APPLICATIONS, 2019, 17 (05) : 715 - 736
  • [3] Nonlinear Dimension Reduction with Kernel Sliced Inverse Regression
    Yeh, Yi-Ren
    Huang, Su-Yun
    Lee, Yuh-Jye
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2009, 21 (11) : 1590 - 1603
  • [5] Sliced Inverse Regression With Adaptive Spectral Sparsity for Dimension Reduction
    Xu, Xiao-Lin
    Ren, Chuan-Xian
    Wu, Ran-Chao
    Yan, Hong
    IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (03) : 759 - 771
  • [7] Robust dimension reduction using sliced inverse median regression
    Christou, Eliana
    STATISTICAL PAPERS, 2020, 61 (05) : 1799 - 1818
  • [8] A REVIEW ON SLICED INVERSE REGRESSION, SUFFICIENT DIMENSION REDUCTION, AND APPLICATIONS
    Huang, Ming-Yueh
    Hung, Hung
    STATISTICA SINICA, 2022, 32 : 2297 - 2314
  • [9] Online Sufficient Dimension Reduction Through Sliced Inverse Regression
    Cai, Zhanrui
    Li, Runze
    Zhu, Liping
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [10] Sliced regression for dimension reduction
    Wang, Hansheng
    Xia, Yingcun
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2008, 103 (482) : 811 - 821