A Riemannian Rank-Adaptive Method for Higher-Order Tensor Completion in the Tensor-Train Format

Cited by: 0
Authors
Vermeylen, Charlotte [1 ]
Van Barel, Marc [2 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT, Leuven, Belgium
[2] Katholieke Univ Leuven, Dept Comp Sci, Leuven, Belgium
Keywords
optimization; tangent cone; tensor completion; tensor-train manifold
DOI
10.1002/nla.2606
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
A new Riemannian rank-adaptive method (RRAM) is proposed for the low-rank tensor completion problem (LRTCP). The problem is formulated as a least-squares optimization problem on the algebraic variety of tensors of bounded tensor-train (TT) rank. The RRAM iteratively optimizes over fixed-rank smooth manifolds using the Riemannian conjugate gradient algorithm of Steinlechner. Between these runs, the rank is increased along a descent direction selected in the tangent cone to the variety. A numerical method to estimate the rank increase is proposed, based on a new theoretical result for the low-rank tensor approximation problem and a definition of an estimated TT-rank. When the iterate comes close to a lower-rank set, the RRAM decreases the rank using the TT-rounding algorithm of Oseledets and a definition of a numerical rank. It is shown that TT-rounding can be regarded as an approximate projection onto the lower-rank set that satisfies a certain angle condition, which ensures that its image is sufficiently close to that of an exact projection. Several numerical experiments illustrate the use of the RRAM and its subroutines in MATLAB. In all experiments, the proposed RRAM significantly outperforms the state-of-the-art RRAM for tensor completion in the TT format of Steinlechner in terms of computation time.
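As a rough illustration of the rank-decrease subroutine the abstract refers to, the following NumPy sketch implements the standard TT-rounding procedure (right-to-left orthogonalization followed by a left-to-right truncated-SVD sweep), in the spirit of Oseledets' algorithm. The function names `tt_rounding` and `tt_full` are ours, and the tolerance handling is a textbook simplification, not the paper's numerical-rank definition.

```python
import numpy as np

def tt_rounding(cores, eps=1e-10):
    """Truncate the TT-ranks of a tensor given by TT cores.

    cores[k] has shape (r_{k-1}, n_k, r_k), with r_0 = r_d = 1.
    Sketch of Oseledets-style TT-rounding: orthogonalize right-to-left,
    then sweep left-to-right with rank-truncated SVDs.
    """
    d = len(cores)
    cores = [c.copy() for c in cores]
    # Right-to-left orthogonalization via QR of the transposed unfolding.
    for k in range(d - 1, 0, -1):
        r0, n, r1 = cores[k].shape
        mat = cores[k].reshape(r0, n * r1)
        q, r = np.linalg.qr(mat.T)              # mat = r.T @ q.T
        cores[k] = q.T.reshape(-1, n, r1)
        cores[k - 1] = np.einsum('abc,cd->abd', cores[k - 1], r.T)
    # After orthogonalization the tensor norm sits in the first core.
    nrm = np.linalg.norm(cores[0])
    delta = eps * nrm / np.sqrt(max(d - 1, 1))  # per-sweep truncation budget
    # Left-to-right SVD sweep with rank truncation.
    for k in range(d - 1):
        r0, n, r1 = cores[k].shape
        u, s, vt = np.linalg.svd(cores[k].reshape(r0 * n, r1),
                                 full_matrices=False)
        # Smallest rank whose discarded tail has norm at most delta.
        tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]
        keep = max(1, int(np.sum(tail > delta)))
        cores[k] = u[:, :keep].reshape(r0, n, keep)
        cores[k + 1] = np.einsum('i,ij,jbc->ibc',
                                 s[:keep], vt[:keep, :], cores[k + 1])
    return cores

def tt_full(cores):
    """Contract TT cores into the full tensor (for small examples only)."""
    x = cores[0][0]                              # drop leading rank-1 mode
    for c in cores[1:]:
        x = np.tensordot(x, c, axes=([-1], [0]))
    return x[..., 0]                             # drop trailing rank-1 mode
```

For instance, zero-padding the cores of a TT-rank-(3, 3) tensor up to rank (6, 6) and rounding with a small tolerance recovers ranks of at most 3 while leaving the represented tensor unchanged up to rounding error.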
Pages: 20