Group SLOPE Penalized Low-Rank Tensor Regression

Cited by: 0
Authors
Chen, Yang [1 ]
Luo, Ziyan [1 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
difference-of-convex; false discovery rate; group sparsity; low-rankness; tensor regression; decompositions; sparsity
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
This article seeks a selection and estimation procedure for a class of tensor regression problems with multivariate covariates and matrix responses that provides theoretical guarantees for model selection in finite samples. Exploiting the frontal-slice sparsity and low-rankness inherent in the coefficient tensor, we formulate the regression procedure as a group SLOPE penalized low-rank tensor optimization problem based on an orthogonal decomposition, termed TgSLOPE. This procedure provably controls the newly introduced tensor group false discovery rate (TgFDR), provided that the predictor matrix is column-orthogonal. Moreover, we establish asymptotically minimax convergence of the TgSLOPE estimation risk. For efficient computation, we equivalently transform the TgSLOPE problem into a difference-of-convex (DC) program with a level-coercive objective function, which allows us to solve the reformulated problem with an efficient proximal DC algorithm (DCA) that converges globally. Numerical studies on synthetic data and real human brain connectivity data illustrate the efficacy of the proposed TgSLOPE estimation procedure.
Pages: 30
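To make the penalty in the abstract concrete, the following is a minimal sketch of a group SLOPE penalty evaluated on the frontal slices of a coefficient tensor: each slice forms one group, group sizes are measured in Frobenius norm, and the sorted norms are weighted by a non-increasing lambda sequence. The function name `group_slope_penalty` and the Benjamini-Hochberg-type choice of lambda are illustrative assumptions, not the paper's exact TgSLOPE specification.

```python
import numpy as np
from scipy.stats import norm as gaussian

def group_slope_penalty(B, lam):
    """Sorted-l1 (SLOPE) penalty on the Frobenius norms of the
    frontal slices B[:, :, g], g = 0, ..., G-1.

    lam must be non-negative and non-increasing (lam[0] >= ... >= 0),
    so the largest group norm is paired with the largest weight.
    """
    # Frobenius norm of each frontal slice (one group per slice).
    group_norms = np.linalg.norm(B, axis=(0, 1))   # shape (G,)
    # Sort group norms in decreasing order, then take the weighted sum.
    sorted_norms = np.sort(group_norms)[::-1]
    return float(np.dot(lam, sorted_norms))

# Usage: a 4 x 5 x 6 coefficient tensor with a BH-type weight sequence
# (a common choice for SLOPE-style penalties; assumed here, with q the
# illustrative target FDR level).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 5, 6))
G = B.shape[2]
q = 0.1
lam = gaussian.ppf(1 - q * np.arange(1, G + 1) / (2 * G))
print(group_slope_penalty(B, lam))
```

Because the weights decrease with rank, larger slice norms pay proportionally more, which is the mechanism that lets SLOPE-type penalties control the false discovery rate over groups.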