Low-rank optimization on Tucker tensor varieties

Citations: 0
Authors
Gao, Bin [1 ]
Peng, Renfeng [1 ,2 ]
Yuan, Ya-xiang [1 ]
Affiliations
[1] Chinese Acad Sci, Acad Math & Syst Sci, State Key Lab Sci & Engn Comp, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Low-rank optimization; Tucker decomposition; Algebraic variety; Tangent cone; Rank-adaptive strategy; RIEMANNIAN OPTIMIZATION; COMPLETION; ALGORITHMS;
DOI
10.1007/s10107-024-02186-w
Chinese Library Classification
TP31 [Computer Software];
Discipline Classification
081202 ; 0835 ;
Abstract
In the realm of tensor optimization, the low-rank Tucker decomposition is crucial for reducing the number of parameters and for saving storage. We explore the geometry of Tucker tensor varieties, the set of tensors with bounded Tucker rank, which is notably more intricate than the well-explored matrix varieties. We give an explicit parametrization of the tangent cone of Tucker tensor varieties and leverage its geometry to develop provable gradient-related line-search methods for optimization on Tucker tensor varieties. In practice, low-rank tensor optimization suffers from the difficulty of choosing a reliable rank parameter. To this end, we incorporate the established geometry and propose a Tucker rank-adaptive method that aims to identify an appropriate rank with guaranteed convergence. Numerical experiments on tensor completion reveal that the proposed methods achieve favorable recovery performance compared with other state-of-the-art methods. The rank-adaptive method performs the best across various rank parameter selections and is indeed able to find an appropriate rank.
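The low-rank Tucker decomposition named in the abstract factors a tensor into a small core tensor multiplied by a factor matrix along each mode. A minimal sketch of this idea is the truncated higher-order SVD (HOSVD), which produces a tensor of bounded Tucker rank; this is a generic illustration of the decomposition itself, not the line-search or rank-adaptive methods proposed in the paper.

```python
# Hedged sketch: truncated HOSVD as an illustration of low-rank Tucker
# decomposition. Function names here are illustrative, not from the paper.
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front, then flatten."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def truncated_hosvd(T, ranks):
    """Approximate T by a Tucker decomposition with multilinear rank <= ranks."""
    factors = []
    for mode, r in enumerate(ranks):
        # Leading left singular vectors of each mode unfolding
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: contract T with each factor's transpose along its mode
    core = T
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

def tucker_to_full(core, factors):
    """Reassemble the full tensor from a core and factor matrices."""
    T = core
    for mode, U in enumerate(factors):
        T = np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)
    return T

rng = np.random.default_rng(0)
# Build an exactly rank-(2, 2, 2) tensor; truncated HOSVD recovers it exactly.
G = rng.standard_normal((2, 2, 2))
Us = [rng.standard_normal((n, 2)) for n in (5, 6, 7)]
T = tucker_to_full(G, Us)
core, factors = truncated_hosvd(T, (2, 2, 2))
rel_err = np.linalg.norm(tucker_to_full(core, factors) - T) / np.linalg.norm(T)
```

The storage saving is visible in the shapes: the 5 x 6 x 7 tensor (210 entries) is represented by a 2 x 2 x 2 core plus three thin factor matrices (8 + 10 + 12 + 14 = 44 numbers).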
Pages: 51