PRECONDITIONED LOW-RANK RIEMANNIAN OPTIMIZATION FOR LINEAR SYSTEMS WITH TENSOR PRODUCT STRUCTURE

Cited by: 29
Authors
Kressner, Daniel [1]
Steinlechner, Michael [1]
Vandereycken, Bart [2]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sect Math, MATHICSE ANCHP, CH-1015 Lausanne, Switzerland
[2] Univ Geneva, Sect Math, 2-4 Rue Lievre, CH-1211 Geneva, Switzerland
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2016, Vol. 38, No. 4
Keywords
tensors; tensor train; matrix product states; Riemannian optimization; low rank; high dimensionality; KRYLOV SUBSPACE METHODS; APPROXIMATION; DECOMPOSITION; COMPLETION; OPERATORS; TT;
DOI
10.1137/15M1032909
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
The numerical solution of partial differential equations on high-dimensional domains gives rise to computationally challenging linear systems. When using standard discretization techniques, the size of the linear system grows exponentially with the number of dimensions, making the use of classic iterative solvers infeasible. During the last few years, low-rank tensor approaches have been developed that allow one to mitigate this curse of dimensionality by exploiting the underlying structure of the linear operator. In this work, we focus on tensors represented in the Tucker and tensor train formats. We propose two preconditioned gradient methods on the corresponding low-rank tensor manifolds: a Riemannian version of the preconditioned Richardson method as well as an approximate Newton scheme based on the Riemannian Hessian. For the latter, considerable attention is given to the efficient solution of the resulting Newton equation. In numerical experiments, we compare the efficiency of our Riemannian algorithms with other established tensor-based approaches such as a truncated preconditioned Richardson method and the alternating linear scheme. The results show that our approximate Riemannian Newton scheme is significantly faster in cases when the application of the linear operator is expensive.
Pages: A2018 - A2044
Number of pages: 27
Related Papers
50 records in total
  • [1] Low-rank tensor completion by Riemannian optimization
    Kressner, Daniel
    Steinlechner, Michael
    Vandereycken, Bart
    BIT NUMERICAL MATHEMATICS, 2014, 54 (02) : 447 - 468
  • [2] Tensor Completion using Low-Rank Tensor Train Decomposition by Riemannian Optimization
    Wang, Junli
    Zhao, Guangshe
    Wang, Dingheng
    Li, Guoqi
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019 : 3380 - 3384
  • [3] Generalized Low-Rank Plus Sparse Tensor Estimation by Fast Riemannian Optimization
    Cai, Jian-Feng
    Li, Jingyang
    Xia, Dong
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2023, 118 (544) : 2588 - 2604
  • [4] A PRECONDITIONED RIEMANNIAN GRADIENT DESCENT ALGORITHM FOR LOW-RANK MATRIX RECOVERY
    Bian, Fengmiao
    Cai, Jian-feng
    Zhang, Rui
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2024, 45 (04) : 2075 - 2103
  • [5] LOW-RANK MATRIX COMPLETION BY RIEMANNIAN OPTIMIZATION
    Vandereycken, Bart
    SIAM JOURNAL ON OPTIMIZATION, 2013, 23 (02) : 1214 - 1236
  • [6] AUTOMATIC DIFFERENTIATION FOR RIEMANNIAN OPTIMIZATION ON LOW-RANK MATRIX AND TENSOR-TRAIN MANIFOLDS
    Novikov, Alexander
    Rakhuba, Maxim
    Oseledets, Ivan
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44 (02) : A843 - A869
  • [7] A Riemannian rank-adaptive method for low-rank optimization
    Zhou, Guifang
    Huang, Wen
    Gallivan, Kyle A.
    Van Dooren, Paul
    Absil, Pierre-Antoine
    NEUROCOMPUTING, 2016, 192 : 72 - 80
  • [8] Low-rank tensor completion: a Riemannian manifold preconditioning approach
    Kasai, Hiroyuki
    Mishra, Bamdev
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016
  • [9] Riemannian conjugate gradient method for low-rank tensor completion
    Duan, Shan-Qi
    Duan, Xue-Feng
    Li, Chun-Mei
    Li, Jiao-Fen
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2023, 49 (03)