Robust High-Dimensional Low-Rank Matrix Estimation: Optimal Rate and Data-Adaptive Tuning

Cited: 0
Authors
Cui, Xiaolong [1 ]
Shi, Lei [2 ]
Zhong, Wei [3 ,4 ]
Zou, Changliang [5 ]
Affiliations
[1] Nankai Univ, Sch Stat & Data Sci, Tianjin, Peoples R China
[2] Univ Calif Berkeley, Dept Biostat, Berkeley, CA USA
[3] Xiamen Univ, WISE, Xiamen, Peoples R China
[4] Xiamen Univ, Dept Stat & Data Sci, SOE, Xiamen, Peoples R China
[5] Nankai Univ, Sch Stat & Data Sci, LPMC KLMDASR & LEBPS, Tianjin, Peoples R China
Keywords
heavy-tailed error; high dimension; low-rank matrix; non-asymptotic bounds; robustness; tuning parameter selection; PROXIMAL GRADIENT ALGORITHM; REGRESSION; COMPLETION; SELECTION; RECOVERY; MINIMIZATION; CONVERGENCE;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
The matrix lasso, which minimizes a least-squares loss function with nuclear-norm regularization, offers a generally applicable paradigm for high-dimensional low-rank matrix estimation, but its efficiency deteriorates under heavy-tailed error distributions. This paper introduces a robust procedure that pairs a Wilcoxon-type rank-based loss function with the nuclear-norm penalty in a unified high-dimensional low-rank matrix estimation framework, which includes matrix regression, multivariate regression, and matrix completion as special cases. This procedure enjoys several appealing features. First, it relaxes the distributional conditions on the random errors from sub-exponential or sub-Gaussian to more general distributions, and is thus robust, with substantial efficiency gains for heavy-tailed random errors. Second, because the gradient of the rank-based loss function is completely pivotal, the tuning parameter can be chosen by straightforward simulation, which overcomes the usual difficulty of tuning parameter selection and substantially reduces computation time. Third, we theoretically establish non-asymptotic error bounds with a nearly oracle rate for the new estimator. Numerical results indicate that the new estimator is highly competitive with existing methods, especially for heavy-tailed or skewed errors.
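The two ingredients named in the abstract can be sketched in code: the matrix-lasso baseline (nuclear-norm penalized least squares, solved by proximal gradient with singular-value soft-thresholding) and the idea that a pivotal loss gradient lets the tuning parameter be simulated rather than cross-validated. This is a minimal illustrative sketch, not the authors' implementation: the function names, the specific centered Wilcoxon score `R/(n+1) - 1/2`, and all constants are assumptions for the demo.

```python
import numpy as np

def svt(M, tau):
    """Singular-value soft-thresholding: the proximal map of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def matrix_lasso_completion(Y, mask, lam, step=1.0, iters=300):
    """Proximal gradient for 0.5*||mask*(Theta - Y)||_F^2 + lam*||Theta||_*."""
    Theta = np.zeros_like(Y)
    for _ in range(iters):
        grad = mask * (Theta - Y)                # gradient of the squared loss
        Theta = svt(Theta - step * grad, step * lam)
    return Theta

def simulate_lambda(X, level=0.95, B=200, seed=1):
    """Pivotal tuning (sketch): the rank-loss gradient at the truth depends only
    on the ranks of i.i.d. errors, i.e. a uniform random permutation, so a
    sup-norm quantile can be simulated without knowing the error distribution."""
    n, _ = X.shape
    rng = np.random.default_rng(seed)
    norms = []
    for _ in range(B):
        R = rng.permutation(n) + 1               # ranks of n i.i.d. errors
        score = R / (n + 1) - 0.5                # centered Wilcoxon-type scores
        norms.append(np.abs(X.T @ score / n).max())
    return float(np.quantile(norms, level))

# Demo: noisy matrix completion with a rank-2 ground truth.
rng = np.random.default_rng(0)
d, r = 30, 2
L0 = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))
mask = (rng.random((d, d)) < 0.6).astype(float)  # ~60% of entries observed
Y = mask * (L0 + 0.01 * rng.standard_normal((d, d)))
Theta = matrix_lasso_completion(Y, mask, lam=0.5)
rel_err = np.linalg.norm(Theta - L0) / np.linalg.norm(L0)

lam_sim = simulate_lambda(rng.standard_normal((200, 10)))
```

Note that `simulate_lambda` never touches the observed responses: the simulated quantile depends only on the design and on random permutations, which is what makes the tuning data-adaptive yet distribution-free.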
Pages: 57