Model averaging assisted sufficient dimension reduction

Cited by: 5
Authors
Fang, Fang [1 ]
Yu, Zhou [1 ]
Affiliation
[1] East China Normal Univ, Sch Stat, Key Lab Adv Theory & Applicat Stat & Data Sci, MOE, Shanghai, Peoples R China
Keywords
Jackknife model averaging; Ladle estimator; Mallows model averaging; Principal Hessian directions; Sliced inverse regression; Sufficient dimension reduction; SLICED INVERSE REGRESSION; PRINCIPAL HESSIAN DIRECTIONS; CENTRAL SUBSPACE; SELECTION; SHRINKAGE; INFERENCE; NUMBER
DOI
10.1016/j.csda.2020.106993
CLC Number
TP39 [Applications of Computers]
Subject Classification Codes
081203; 0835
Abstract
Sufficient dimension reduction, which replaces the original predictors with their low-dimensional linear combinations without loss of information, is a critical tool in modern statistics and has gained considerable research momentum in the past decades since the two pioneering methods, sliced inverse regression and principal Hessian directions. The classical sufficient dimension reduction methods do not handle the sparse case well, since the estimated linear reductions involve all of the original predictors. Sparse sufficient dimension reduction methods, in turn, rely on a sparsity assumption that may not hold in practice. Motivated by the least squares formulation of the classical sliced inverse regression and principal Hessian directions, several model averaging assisted sufficient dimension reduction methods are proposed. They are applicable to both dense and sparse cases, even with weak signals, since model averaging adaptively assigns weights to different candidate models. Based on the model averaging assisted sufficient dimension reduction methods, how to estimate the structural dimension is further studied. Theoretical justifications are given, and empirical results show that the proposed methods compare favorably with the classical sufficient dimension reduction methods and popular sparse sufficient dimension reduction methods. (C) 2020 Elsevier B.V. All rights reserved.
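The abstract refers to the least squares formulation of classical sliced inverse regression and principal Hessian directions. As background only, the following is a minimal NumPy sketch of classical sliced inverse regression (slice the ordered response, average the standardized predictors within each slice, and eigendecompose the weighted covariance of the slice means); it is not the model averaging procedure proposed in the paper, and the function name sir_directions and its parameter defaults are illustrative assumptions.

import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """Minimal sliced inverse regression (SIR) sketch: returns a p x d matrix
    whose columns estimate directions spanning the central subspace."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Partition the sample into slices by the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)
    # Leading eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    top = v[:, np.argsort(w)[::-1][:d]]
    return Sigma_inv_sqrt @ top

# Hypothetical usage on a single-index model y = (x1 + x2)^3 + noise:
# the estimated direction should align with (1, 1, 0, 0, 0, 0) up to scale.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = (X[:, 0] + X[:, 1]) ** 3 + 0.5 * rng.standard_normal(500)
B_hat = sir_directions(X, y, n_slices=10, d=1)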
Pages: 18
Related Papers (50 records in total)
  • [31] Liu, Bingyuan; Xue, Lingzhou. Sparse kernel sufficient dimension reduction. Journal of Nonparametric Statistics, 2024.
  • [32] Galbraith, John W.; Hodgson, Douglas J. Dimension reduction and model averaging for estimation of artists' age-valuation profiles. European Economic Review, 2012, 56(3): 422-435.
  • [33] Chen, Xin; Sheng, Wenhui; Yin, Xiangrong. Efficient sparse estimate of sufficient dimension reduction in high dimension. Technometrics, 2018, 60(2): 161-168.
  • [34] Szretter Noste, Maria Eugenia. Using DAGs to identify the sufficient dimension reduction in the Principal Fitted Components model. Statistics & Probability Letters, 2019, 145: 317-320.
  • [35] Huo, Lei; Wen, Xuerong Meggie; Yu, Zhou. A model-free conditional screening approach via sufficient dimension reduction. Journal of Nonparametric Statistics, 2020, 32(4): 970-988.
  • [36] Fan, Guoliang; Zhu, Liping; Ma, Shujie. Nonlinear interaction detection through model-based sufficient dimension reduction. Statistica Sinica, 2019, 29(2): 917-937.
  • [37] Ding, Shanshan; Qian, Wei; Wang, Lan. Double-slicing assisted sufficient dimension reduction for high-dimensional censored data. Annals of Statistics, 2020, 48(4): 2132-2154.
  • [38] Yoo, Jae Keun. Tutorial: Methodologies for sufficient dimension reduction in regression. Communications for Statistical Applications and Methods, 2016, 23(2): 105-117.
  • [39] Zhu, Mu; Hastie, Trevor J. Likelihood-based sufficient dimension reduction. Journal of the American Statistical Association, 2010, 105(490): 880.
  • [40] Li, Bing; Song, Jun. Nonlinear sufficient dimension reduction for functional data. Annals of Statistics, 2017, 45(3): 1059-1095.