PRECONDITIONED LOW-RANK RIEMANNIAN OPTIMIZATION FOR LINEAR SYSTEMS WITH TENSOR PRODUCT STRUCTURE

Times cited: 29
Authors
Kressner, Daniel [1]
Steinlechner, Michael [1]
Vandereycken, Bart [2]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sect Math, MATHICSE ANCHP, CH-1015 Lausanne, Switzerland
[2] Univ Geneva, Sect Math, 2-4 Rue Lievre, CH-1211 Geneva, Switzerland
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2016, Vol. 38, No. 4
Keywords
tensors; tensor train; matrix product states; Riemannian optimization; low rank; high dimensionality; KRYLOV SUBSPACE METHODS; APPROXIMATION; DECOMPOSITION; COMPLETION; OPERATORS; TT;
DOI
10.1137/15M1032909
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
The numerical solution of partial differential equations on high-dimensional domains gives rise to computationally challenging linear systems. When using standard discretization techniques, the size of the linear system grows exponentially with the number of dimensions, making the use of classic iterative solvers infeasible. During the last few years, low-rank tensor approaches have been developed that allow one to mitigate this curse of dimensionality by exploiting the underlying structure of the linear operator. In this work, we focus on tensors represented in the Tucker and tensor train formats. We propose two preconditioned gradient methods on the corresponding low-rank tensor manifolds: a Riemannian version of the preconditioned Richardson method as well as an approximate Newton scheme based on the Riemannian Hessian. For the latter, considerable attention is given to the efficient solution of the resulting Newton equation. In numerical experiments, we compare the efficiency of our Riemannian algorithms with other established tensor-based approaches such as a truncated preconditioned Richardson method and the alternating linear scheme. The results show that our approximate Riemannian Newton scheme is significantly faster in cases when the application of the linear operator is expensive.
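The truncated Richardson method used as a baseline above can be illustrated on the simplest tensor-product example, the 2D Laplace-type system (I ⊗ A + A ⊗ I) vec(X) = vec(B), which is equivalent to the matrix equation A X + X Aᵀ = B. The sketch below is a minimal, unpreconditioned illustration, not the paper's algorithm: each Richardson step is followed by an SVD-based projection back to low rank, and all names and parameters (`n`, `rank`, `omega`) are illustrative choices.

```python
import numpy as np

def truncate(X, rank):
    """Best rank-`rank` approximation of X via a truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Kronecker-structured system (I ⊗ A + A ⊗ I) vec(X) = vec(B),
# i.e. the matrix equation A X + X A^T = B with A a 1D Laplacian.
n, rank, omega = 50, 5, 0.2
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
B = np.ones((n, n))  # rank-1 right-hand side

X = np.zeros((n, n))
for _ in range(200):
    R = B - (A @ X + X @ A.T)          # residual of the structured system
    X = truncate(X + omega * R, rank)  # Richardson step + rank truncation
```

Without a preconditioner the contraction factor degrades with the mesh size, which is exactly the motivation for the preconditioned Riemannian variants studied in the paper; there, the truncation is additionally replaced by retraction onto the low-rank manifold.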
Pages: A2018-A2044
Page count: 27