Model averaging assisted sufficient dimension reduction

Cited: 5
Authors
Fang, Fang [1 ]
Yu, Zhou [1 ]
Affiliations
[1] East China Normal Univ, Sch Stat, Key Lab Adv Theory & Applicat Stat & Data Sci, MOE, Shanghai, Peoples R China
Keywords
Jackknife model averaging; Ladle estimator; Mallows model averaging; Principal Hessian directions; Sliced inverse regression; Sufficient dimension reduction; SLICED INVERSE REGRESSION; PRINCIPAL HESSIAN DIRECTIONS; CENTRAL SUBSPACE; SELECTION; SHRINKAGE; INFERENCE; NUMBER;
DOI
10.1016/j.csda.2020.106993
Chinese Library Classification (CLC)
TP39 [Applications of computers]
Subject classification codes
081203; 0835
Abstract
Sufficient dimension reduction, which replaces the original predictors with a few of their linear combinations without loss of information, is a critical tool in modern statistics and has gained considerable research momentum in the past decades since the two pioneering methods, sliced inverse regression and principal Hessian directions. The classical sufficient dimension reduction methods do not handle the sparse case well, since the estimated linear reductions involve all of the original predictors. Sparse sufficient dimension reduction methods, in turn, rely on a sparsity assumption that may not hold in practice. Motivated by the least squares formulation of the classical sliced inverse regression and principal Hessian directions, several model averaging assisted sufficient dimension reduction methods are proposed. They are applicable to both dense and sparse cases, even with weak signals, since model averaging adaptively assigns weights to different candidate models. Based on the model averaging assisted sufficient dimension reduction methods, estimation of the structural dimension is further studied. Theoretical justifications are given, and empirical results show that the proposed methods compare favorably with the classical sufficient dimension reduction methods and popular sparse sufficient dimension reduction methods. (C) 2020 Elsevier B.V. All rights reserved.
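As background for the abstract above, the following is a minimal, illustrative sketch of classical sliced inverse regression, the eigendecomposition-based baseline that the proposed model averaging methods build on. It is not the authors' model averaging estimator; the function name, the number of slices, and the toy data are illustrative assumptions only.

import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=1):
    """Minimal sliced inverse regression (Li, 1991). Illustrative sketch only;
    not the model averaging assisted estimator proposed in the paper."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}.
    X_c = X - X.mean(axis=0)
    Sigma = np.cov(X_c, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = X_c @ Sigma_inv_sqrt

    # Slice the response into roughly equal-sized groups by its order statistics.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted covariance of the slice means of Z: M = sum_h (n_h / n) m_h m_h^T.
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)

    # Leading eigenvectors of M, mapped back to the original predictor scale.
    eigvals, eigvecs = np.linalg.eigh(M)
    top = eigvecs[:, ::-1][:, :n_directions]
    beta = Sigma_inv_sqrt @ top
    return beta / np.linalg.norm(beta, axis=0)

# Toy usage: y depends on X only through one linear combination of the predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = np.sin(X[:, 0] + 0.5 * X[:, 1]) + 0.1 * rng.normal(size=500)
print(sliced_inverse_regression(X, y, n_slices=10, n_directions=1).ravel())

In this toy example the recovered direction should be roughly proportional to (1, 0.5, 0, 0, 0, 0); note that the estimate loads on every predictor, which is the dense-estimate behavior the abstract contrasts with sparse approaches.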
Pages: 18
Related papers
50 records in total
  • [21] Sufficient dimension reduction for compositional data
    Tomassi, Diego
    Forzani, Liliana
    Duarte, Sabrina
    Pfeiffer, Ruth M.
    BIOSTATISTICS, 2021, 22 (04) : 687 - 705
  • [22] A unified approach to sufficient dimension reduction
    Xue, Yuan
    Wang, Qin
    Yin, Xiangrong
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2018, 197 : 168 - 179
  • [23] Diagnostic studies in sufficient dimension reduction
    Chen, Xin
    Cook, R. Dennis
    Zou, Changliang
    BIOMETRIKA, 2015, 102 (03) : 545 - 558
  • [24] Sparse sufficient dimension reduction with heteroscedasticity
    Cheng, Haoyang
    Cui, Wenquan
    INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2022, 20 (01)
  • [25] Sufficient dimension reduction and prediction in regression
    Adragni, Kofi P.
    Cook, R. Dennis
    PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES, 2009, 367 (1906) : 4385 - 4405
  • [26] Deep nonlinear sufficient dimension reduction
    Chen, YinFeng
    Jiao, YuLing
    Qiu, Rui
    Hu, Zhou
    ANNALS OF STATISTICS, 2024, 52 (03) : 1201 - 1226
  • [27] Sufficient dimension reduction and graphics in regression
    Chiaromonte, F
    Cook, RD
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2002, 54 (04) : 768 - 795
  • [28] A Note on Bootstrapping in Sufficient Dimension Reduction
    Yoo, Jae Keun
    Jeong, Sun
    COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2015, 22 (03) : 285 - 294