PRECONDITIONED LOW-RANK RIEMANNIAN OPTIMIZATION FOR LINEAR SYSTEMS WITH TENSOR PRODUCT STRUCTURE

Cited by: 29
Authors
Kressner, Daniel [1 ]
Steinlechner, Michael [1 ]
Vandereycken, Bart [2 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sect Math, MATHICSE ANCHP, CH-1015 Lausanne, Switzerland
[2] Univ Geneva, Sect Math, 2-4 Rue Lievre, CH-1211 Geneva, Switzerland
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2016, Vol. 38, No. 4
Keywords
tensors; tensor train; matrix product states; Riemannian optimization; low rank; high dimensionality; KRYLOV SUBSPACE METHODS; APPROXIMATION; DECOMPOSITION; COMPLETION; OPERATORS; TT;
DOI
10.1137/15M1032909
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
The numerical solution of partial differential equations on high-dimensional domains gives rise to computationally challenging linear systems. When using standard discretization techniques, the size of the linear system grows exponentially with the number of dimensions, making the use of classic iterative solvers infeasible. During the last few years, low-rank tensor approaches have been developed that allow one to mitigate this curse of dimensionality by exploiting the underlying structure of the linear operator. In this work, we focus on tensors represented in the Tucker and tensor train formats. We propose two preconditioned gradient methods on the corresponding low-rank tensor manifolds: a Riemannian version of the preconditioned Richardson method as well as an approximate Newton scheme based on the Riemannian Hessian. For the latter, considerable attention is given to the efficient solution of the resulting Newton equation. In numerical experiments, we compare the efficiency of our Riemannian algorithms with other established tensor-based approaches such as a truncated preconditioned Richardson method and the alternating linear scheme. The results show that our approximate Riemannian Newton scheme is significantly faster in cases when the application of the linear operator is expensive.
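The truncated Richardson iteration that the abstract uses as a baseline can be illustrated in a minimal two-dimensional (matrix) analog, where the low-rank tensor format reduces to a low-rank matrix and the rank truncation is an SVD. This is a sketch under illustrative assumptions (1D Laplacian operator, rank-1 right-hand side, hand-picked step size and rank), not the paper's actual algorithm or experimental setup:

```python
import numpy as np

def truncate(X, r):
    # Rank-r truncation via SVD; in the matrix case this plays the role
    # of mapping an iterate back to the low-rank set.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def truncated_richardson(A, F, r, omega, iters):
    # Truncated Richardson iteration for the Lyapunov-type system
    # A X + X A^T = F: take a Richardson step, then truncate to rank r.
    X = np.zeros_like(F)
    for _ in range(iters):
        R = A @ X + X @ A.T - F          # residual of the structured operator
        X = truncate(X - omega * R, r)   # step + rank truncation
    return X

# 1D Laplacian: the 2D Poisson-type problem then reads
# (A ⊗ I + I ⊗ A) vec(X) = vec(F), a Kronecker-structured linear system.
n = 20
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
F = np.outer(np.ones(n), np.ones(n))     # rank-1 right-hand side
X = truncated_richardson(A, F, r=5, omega=0.2, iters=500)
rel_res = np.linalg.norm(A @ X + X @ A.T - F) / np.linalg.norm(F)
```

Plain Richardson converges slowly for such ill-conditioned operators, which is precisely the motivation for the preconditioned and Riemannian variants the paper develops; the sketch only shows the step-then-truncate structure shared by this family of methods.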
Pages: A2018 - A2044
Number of pages: 27