Group SLOPE Penalized Low-Rank Tensor Regression

Cited: 0
Authors
Chen, Yang [1 ]
Luo, Ziyan [1 ]
Institution
[1] Beijing Jiaotong Univ, Sch Math & Stat, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
difference-of-convex; false discovery rate; group sparsity; low-rankness; tensor regression; DECOMPOSITIONS; SPARSITY;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline classification code
0812;
Abstract
This article develops a selection and estimation procedure for a class of tensor regression problems with multivariate covariates and matrix responses, one that provides theoretical guarantees for model selection in finite samples. Exploiting the frontal-slice sparsity and low-rankness inherent in the coefficient tensor, we formulate the regression procedure as a group SLOPE penalized low-rank tensor optimization problem based on an orthogonal decomposition, termed TgSLOPE. This procedure provably controls the newly introduced tensor group false discovery rate (TgFDR), provided that the predictor matrix is column-orthogonal. Moreover, we establish asymptotic minimax convergence of the TgSLOPE estimation risk. For efficient problem resolution, we equivalently transform the TgSLOPE problem into a difference-of-convex (DC) program with a level-coercive objective function, which allows us to solve the reformulated problem by an efficient proximal DC algorithm (DCA) with global convergence. Numerical studies on synthetic data and a real human brain connectivity dataset illustrate the efficacy of the proposed TgSLOPE estimation procedure.
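The group SLOPE penalty described in the abstract applies a decreasing weight sequence to the sorted group norms of the coefficient tensor's frontal slices. A minimal sketch of evaluating such a penalty follows; the slice orientation (groups along the first mode) and the choice of weight sequence are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def group_slope_penalty(B, lambdas):
    """Evaluate a group SLOPE penalty on the frontal slices of a
    3-way coefficient tensor B.

    Assumption: each frontal slice B[g] is one group, and the penalty
    is the inner product of the descending-sorted slice Frobenius
    norms with a non-increasing weight sequence `lambdas`.
    """
    # Frobenius norm of each frontal slice (one norm per group)
    norms = np.array([np.linalg.norm(B[g]) for g in range(B.shape[0])])
    norms_desc = np.sort(norms)[::-1]              # largest group first
    lam_desc = np.sort(np.asarray(lambdas, dtype=float))[::-1]
    # Largest weight multiplies the largest group norm, as in SLOPE
    return float(lam_desc @ norms_desc)
```

With only one nonzero slice, the penalty reduces to the largest weight times that slice's Frobenius norm, mirroring how SLOPE penalizes the strongest signals most heavily.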
Pages: 30
Related papers
50 records total
  • [41] Tensor low-rank sparse representation for tensor subspace learning
    Du, Shiqiang
    Shi, Yuqing
    Shan, Guangrong
    Wang, Weilan
    Ma, Yide
    NEUROCOMPUTING, 2021, 440 : 351 - 364
  • [42] Accelerated Low-rank Updates to Tensor Decompositions
    Baskaran, Muthu
    Langston, M. Harper
    Ramananandro, Tahina
    Bruns-Smith, David
    Henretty, Tom
    Ezick, James
    Lethin, Richard
    2016 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE (HPEC), 2016,
  • [43] Optimal Low-Rank Tensor Tree Completion
    Li, Zihan
    Zhu, Ce
    Long, Zhen
    Liu, Yipeng
    2023 IEEE 25TH INTERNATIONAL WORKSHOP ON MULTIMEDIA SIGNAL PROCESSING, MMSP, 2023,
  • [44] Noisy Tensor Completion via Low-Rank Tensor Ring
    Qiu, Yuning
    Zhou, Guoxu
    Zhao, Qibin
    Xie, Shengli
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 1127 - 1141
  • [45] Low-rank Tensor Restoration for ERP extraction
    Bonab, Zahra Sohrabi
    Shamsollahi, Mohammad B.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 87
  • [46] Locally Linear Low-rank Tensor Approximation
    Ozdemir, Alp
    Iwen, Mark A.
    Aviyente, Selin
    2015 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP), 2015, : 839 - 843
  • [47] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [48] Low-rank optimization on Tucker tensor varieties
    Gao, Bin
    Peng, Renfeng
    Yuan, Ya-xiang
    MATHEMATICAL PROGRAMMING, 2025,
  • [49] A dual framework for low-rank tensor completion
    Nimishakavi, Madhav
    Jawanpuria, Pratik
    Mishra, Bamdev
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [50] Statistical mechanics of low-rank tensor decomposition
    Kadmon, Jonathan
    Ganguli, Surya
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12):