A Riemannian Rank-Adaptive Method for Higher-Order Tensor Completion in the Tensor-Train Format

Cited by: 0
Authors
Vermeylen, Charlotte [1 ]
Van Barel, Marc [2 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Elect Engn ESAT, Leuven, Belgium
[2] Katholieke Univ Leuven, Dept Comp Sci, Leuven, Belgium
Keywords
optimization; tangent cone; tensor completion; tensor-train manifold
DOI
10.1002/nla.2606
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
A new Riemannian rank-adaptive method (RRAM) is proposed for the low-rank tensor completion problem (LRTCP). The problem is formulated as a least-squares optimization problem on the algebraic variety of tensors of bounded tensor-train (TT) rank. The RRAM iteratively optimizes over fixed-rank smooth manifolds using the Riemannian conjugate gradient algorithm of Steinlechner. In between these optimizations, the rank is increased by computing a descent direction selected in the tangent cone to the variety. A numerical method to estimate the rank increase is proposed, based on a new theoretical result for the low-rank tensor approximation problem and on a definition of an estimated TT-rank. When the iterate comes close to a lower-rank set, the RRAM decreases the rank using the TT-rounding algorithm of Oseledets and a definition of a numerical rank. It is shown that TT-rounding can be considered an approximate projection onto the lower-rank set that satisfies a certain angle condition, which ensures its image is sufficiently close to that of an exact projection. Several numerical experiments illustrate the use of the RRAM and its subroutines in Matlab. In all experiments, the proposed RRAM significantly outperforms the state-of-the-art RRAM for tensor completion in the TT format of Steinlechner in terms of computation time.
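The TT-rounding algorithm of Oseledets, which the abstract uses for rank decrease, is the standard sequential-SVD truncation: a right-to-left orthogonalization sweep followed by a left-to-right truncated-SVD sweep. The following is a minimal illustrative sketch in Python/NumPy (the paper's experiments are in Matlab); it is not the authors' code, and the demo tensor at the end is a made-up example.

```python
import numpy as np

def tt_full(cores):
    """Contract TT cores (each of shape (r_prev, n_k, r_k)) into the full tensor."""
    T = cores[0]
    for c in cores[1:]:
        T = np.tensordot(T, c, axes=([-1], [0]))
    return T.reshape(T.shape[1:-1])  # drop the boundary ranks r_0 = r_d = 1

def tt_round(cores, eps):
    """Truncate a TT decomposition to lower TT-rank with relative tolerance eps
    (sequential-SVD TT-rounding in the spirit of Oseledets, 2011)."""
    d = len(cores)
    cores = [c.copy() for c in cores]

    # Right-to-left sweep: make cores 1..d-1 right-orthogonal via QR of the
    # transposed unfolding, absorbing the triangular factor into the previous core.
    for k in range(d - 1, 0, -1):
        r_prev, n, r = cores[k].shape
        Q, R = np.linalg.qr(cores[k].reshape(r_prev, n * r).T)
        r_new = Q.shape[1]
        cores[k] = Q.T.reshape(r_new, n, r)
        cores[k - 1] = np.einsum('abc,cd->abd', cores[k - 1], R.T)

    # After orthogonalization the tensor norm sits in the first core;
    # distribute the tolerance over the d-1 truncation steps.
    delta = eps / np.sqrt(max(d - 1, 1)) * np.linalg.norm(cores[0])

    # Left-to-right sweep: truncated SVD of each core's left unfolding.
    for k in range(d - 1):
        r_prev, n, r = cores[k].shape
        U, s, Vt = np.linalg.svd(cores[k].reshape(r_prev * n, r),
                                 full_matrices=False)
        tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]  # norm of discarded tail
        rk = max(1, int(np.sum(tail > delta)))         # smallest admissible rank
        cores[k] = U[:, :rk].reshape(r_prev, n, rk)
        cores[k + 1] = np.einsum('ab,bc,cde->ade',
                                 np.diag(s[:rk]), Vt[:rk], cores[k + 1])
    return cores

# Demo: a rank-1 tensor stored with padded TT-rank 2 should round back to rank 1.
rng = np.random.default_rng(0)
a = rng.standard_normal((1, 4, 1))
b = rng.standard_normal((1, 5, 1))
c = rng.standard_normal((1, 6, 1))
A = np.concatenate([a, np.zeros_like(a)], axis=2)   # pad last TT-rank to 2
B = np.zeros((2, 5, 2)); B[0, :, 0] = b[0, :, 0]
C = np.zeros((2, 6, 1)); C[0] = c[0]
T = tt_full([A, B, C])
rounded = tt_round([A, B, C], eps=1e-10)
```

The demo illustrates why the paper can treat TT-rounding as an (approximate) projection onto a lower-rank set: the padded representation carries superfluous TT-rank, and the truncation sweep removes it without changing the underlying tensor beyond the tolerance.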
Pages: 20