DOUBLE-SLICING ASSISTED SUFFICIENT DIMENSION REDUCTION FOR HIGH-DIMENSIONAL CENSORED DATA

Times Cited: 6
Authors
Ding, Shanshan [1 ]
Qian, Wei [1 ]
Wang, Lan [2 ]
Affiliations
[1] Univ Delaware, Dept Appl Econ & Stat, Newark, DE 19716 USA
[2] Univ Miami, Dept Management Sci, Coral Gables, FL 33124 USA
Source
ANNALS OF STATISTICS | 2020, Vol. 48, No. 4
Keywords
Central subspace; sufficient dimension reduction; variable selection; censored data; ultrahigh dimension; nonparametric estimation; SLICED INVERSE REGRESSION; PROPORTIONAL HAZARDS MODEL; VARIABLE SELECTION; ADAPTIVE LASSO; REGULARIZATION; CONSISTENCY; PREDICTION; SPARSITY;
D O I
10.1214/19-AOS1880
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208; 070103; 0714;
Abstract
This paper provides a unified framework and an efficient algorithm for analyzing high-dimensional survival data under weak modeling assumptions. In particular, it imposes neither a parametric distributional assumption nor a linear regression assumption. It assumes only that the survival time T depends on a high-dimensional covariate vector X through low-dimensional linear combinations of covariates Gamma^T X. The censoring time is allowed to be conditionally independent of the survival time given the covariates. This general framework includes many popular parametric and semiparametric survival regression models as special cases. The proposed algorithm produces a number of practically useful outputs with theoretical guarantees, including a consistent estimate of the sufficient dimension reduction subspace of T | X, a uniformly consistent Kaplan-Meier-type estimator of the conditional distribution function of T, and a consistent estimator of the conditional quantile survival time. Our asymptotic results significantly extend the classical theory of sufficient dimension reduction for censored data (particularly that of Li, Wang and Chen in Ann. Statist. 27 (1999) 1-23) and the celebrated nonparametric Kaplan-Meier estimator to the setting where the number of covariates p diverges exponentially fast with the sample size n. We demonstrate the promising performance of the proposed new estimators through simulations and a real data example.
Pages: 2132-2154 (23 pages)
Related Papers
50 records total
  • [21] Sufficient dimension reduction and prediction through cumulative slicing PFC
    Xu, Xinyi; Li, Xiangjie; Zhang, Jingxiao
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2018, 88(6): 1172-1190
  • [22] Dimension reduction for censored regression data
    Li, K. C.; Wang, J. L.; Chen, C. H.
    ANNALS OF STATISTICS, 1999, 27(1): 1-23
  • [23] Grassmannian diffusion maps-based dimension reduction and classification for high-dimensional data
    Dos Santos, Ketson R.; Giovanis, Dimitrios G.; Shields, Michael D.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44(2): B250-B274
  • [24] A hybrid dimension reduction based linear discriminant analysis for classification of high-dimensional data
    Zorarpaci, Ezgi
    2021 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC 2021), 2021: 1028-1036
  • [25] Mining high-dimensional CyTOF data: concurrent gating, outlier removal, and dimension reduction
    Lee, Sharon X.
    DATABASES THEORY AND APPLICATIONS, ADC 2017, 2017, 10538: 178-189
  • [26] Dimension reduction for high-dimensional small counts with KL divergence
    Ling, Yurong; Xue, Jing-Hao
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2022, 180: 1210-1220
  • [27] Optimal dimension reduction for high-dimensional and functional time series
    Hallin, M.; Hörmann, S.; Lippi, M.
    STATISTICAL INFERENCE FOR STOCHASTIC PROCESSES, 2018, 21(2): 385-398
  • [28] On the performance of adaptive preprocessing technique in analyzing high-dimensional censored data
    Khan, Md Hasinur Rahaman
    BIOMETRICAL JOURNAL, 2018, 60(4): 687-702
  • [29] Model averaging assisted sufficient dimension reduction
    Fang, Fang; Yu, Zhou
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2020, 152
  • [30] Manifold-based denoising, outlier detection, and dimension reduction algorithm for high-dimensional data
    Zhao, Guanghua; Yang, Tao; Fu, Dongmei
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14(11): 3923-3942