High-dimensional sufficient dimension reduction through principal projections

Cited by: 0
Authors
Pircalabelu, Eugen [1 ]
Artemiou, Andreas [2 ]
Affiliations
[1] UCLouvain, Inst Stat Biostat & Actuarial Sci, Voie du Roman Pays 20, B-1348 Louvain-la-Neuve, Belgium
[2] Cardiff Univ, Sch Math, Senghennydd Rd, Cardiff CF24 4AG, Wales
Source
ELECTRONIC JOURNAL OF STATISTICS | 2022, Vol. 16, No. 1
Keywords
Sufficient dimension reduction; support vector machines; quadratic programming; l1-penalized estimation; debiased estimator; SLICED INVERSE REGRESSION; CONFIDENCE-INTERVALS; STATISTICS;
DOI
10.1214/22-EJS1988
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline codes
020208; 070103; 0714;
Abstract
In this work we develop a new dimension reduction method for high-dimensional settings. The proposed procedure is based on a principal support vector machine framework in which principal projections are used to overcome the non-invertibility of the covariance matrix. Through a series of equivalences, we show that the central subspace can be accurately recovered by first projecting onto a lower-dimensional subspace and then applying an l1 penalization strategy to obtain sparse estimators of the sufficient directions. Building on a desparsified estimator, we then provide an inferential procedure for high-dimensional models that allows testing for the importance of individual variables in determining the sufficient direction. Theoretical properties of the methodology are illustrated, and computational advantages are demonstrated on simulated and real data experiments.
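The projection-then-penalize pipeline sketched in the abstract can be illustrated with off-the-shelf tools. The following is a minimal sketch, not the authors' estimator: it uses PCA for the principal projection, scikit-learn's `LinearSVC` as a stand-in for the l1-penalized principal support vector machine (with a single median slice of the response), and omits the desparsified inference step entirely. All dimensions and the data-generating model are illustrative assumptions.

```python
# Illustrative sketch of the projection-then-penalize idea (NOT the paper's
# exact estimator): project X onto leading principal components, fit an
# l1-penalized linear SVM on a sliced response, and map the hyperplane's
# normal back to the original coordinates as a sufficient-direction estimate.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, p, k = 400, 50, 20                      # k: working projection dimension
X = rng.standard_normal((n, p))
X[:, :5] *= 2.0                            # signal coordinates get larger variance
beta = np.zeros(p)
beta[:3] = 1.0                             # true sufficient direction (assumed)
y = X @ beta + 0.1 * rng.standard_normal(n)

# Step 1: principal projection sidesteps the (near-)singular covariance.
pca = PCA(n_components=k).fit(X)
Z = pca.transform(X)

# Step 2: slice the response at its median and fit an l1-penalized linear SVM.
labels = (y > np.median(y)).astype(int)
svm = LinearSVC(penalty="l1", dual=False, C=1.0, max_iter=10000).fit(Z, labels)

# Step 3: pull the separating hyperplane's normal back to R^p.
b_hat = pca.components_.T @ svm.coef_.ravel()
b_hat /= np.linalg.norm(b_hat)

# Alignment with the true direction (1.0 = perfect recovery up to sign).
cos = abs(b_hat @ beta) / np.linalg.norm(beta)
```

In the paper the response is sliced at several cutoffs and the resulting directions are combined, and inference relies on a debiased version of the sparse estimator; this sketch only demonstrates why the low-dimensional projection makes the SVM problem well-posed when the sample covariance is not invertible.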
Pages: 1804-1830 (27 pages)