PRECONDITIONED LOW-RANK RIEMANNIAN OPTIMIZATION FOR LINEAR SYSTEMS WITH TENSOR PRODUCT STRUCTURE

Cited by: 29
Authors
Kressner, Daniel [1 ]
Steinlechner, Michael [1 ]
Vandereycken, Bart [2 ]
Affiliations
[1] Ecole Polytech Fed Lausanne, Sect Math, MATHICSE ANCHP, CH-1015 Lausanne, Switzerland
[2] Univ Geneva, Sect Math, 2-4 Rue Lievre, CH-1211 Geneva, Switzerland
Source
SIAM JOURNAL ON SCIENTIFIC COMPUTING | 2016, Vol. 38, No. 4
Keywords
tensors; tensor train; matrix product states; Riemannian optimization; low rank; high dimensionality; KRYLOV SUBSPACE METHODS; APPROXIMATION; DECOMPOSITION; COMPLETION; OPERATORS; TT;
DOI
10.1137/15M1032909
Chinese Library Classification
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
The numerical solution of partial differential equations on high-dimensional domains gives rise to computationally challenging linear systems. When using standard discretization techniques, the size of the linear system grows exponentially with the number of dimensions, making the use of classic iterative solvers infeasible. During the last few years, low-rank tensor approaches have been developed that allow one to mitigate this curse of dimensionality by exploiting the underlying structure of the linear operator. In this work, we focus on tensors represented in the Tucker and tensor train formats. We propose two preconditioned gradient methods on the corresponding low-rank tensor manifolds: a Riemannian version of the preconditioned Richardson method as well as an approximate Newton scheme based on the Riemannian Hessian. For the latter, considerable attention is given to the efficient solution of the resulting Newton equation. In numerical experiments, we compare the efficiency of our Riemannian algorithms with other established tensor-based approaches such as a truncated preconditioned Richardson method and the alternating linear scheme. The results show that our approximate Riemannian Newton scheme is significantly faster in cases when the application of the linear operator is expensive.
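The truncated preconditioned Richardson iteration mentioned in the abstract can be illustrated, in a simplified two-dimensional (matrix) setting, by the following minimal Python sketch. The operator A(X) = A1 X + X A2^T, the Jacobi-type preconditioner, the step size omega, and the rank r are illustrative assumptions and not the authors' implementation; the paper itself works on Tucker and tensor-train manifolds with a Riemannian retraction, whereas here a plain truncated SVD stands in for the rank truncation.

# Minimal 2D sketch of a truncated preconditioned Richardson iteration
# (illustrative only; the paper's methods operate on Tucker / tensor-train
# manifolds, not on matrices, and use a Riemannian retraction).
import numpy as np

def truncate(X, r):
    # Best rank-r approximation via truncated SVD: the 2D analogue of
    # truncating back to the low-rank manifold after each step.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def truncated_richardson(apply_A, apply_Pinv, B, r, omega=1.0, tol=1e-8, maxit=200):
    # Iterates X <- T_r( X + omega * P^{-1}(B - A(X)) ), with T_r the SVD truncation.
    X = np.zeros_like(B)
    for _ in range(maxit):
        R = B - apply_A(X)                               # residual
        if np.linalg.norm(R) <= tol * np.linalg.norm(B):
            break
        X = truncate(X + omega * apply_Pinv(R), r)       # preconditioned step + truncation
    return X

# Illustrative example: a Laplace-like operator A(X) = A1 X + X A2^T with a
# crude Jacobi-type preconditioner P^{-1}(R) = R / 4 (assumed, not from the paper).
n, r = 50, 5
A1 = 2.0 * np.eye(n) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)
A2 = A1.copy()
B = np.outer(np.ones(n), np.ones(n))                     # rank-1 right-hand side
X = truncated_richardson(lambda Y: A1 @ Y + Y @ A2.T,
                         lambda R: R / 4.0, B, r, omega=0.5)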
Pages: A2018-A2044
Page count: 27
Related papers
50 records in total
  • [41] ALTERNATING LINEAR SCHEME IN A BAYESIAN FRAMEWORK FOR LOW-RANK TENSOR APPROXIMATION
    Menzen, Clara
    Kok, Manon
    Batselier, Kim
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44(03): A1116-A1144
  • [42] Low-rank tensor ring learning for multi-linear regression
    Liu, Jiani
    Zhu, Ce
    Long, Zhen
    Huang, Huyan
    Liu, Yipeng
    PATTERN RECOGNITION, 2021, 113
  • [43] A NEW LOW-RANK TENSOR LINEAR REGRESSION WITH APPLICATION TO DATA ANALYSIS
    Pan, Chenjian
    He, Hongjin
    Ling, Chen
    PACIFIC JOURNAL OF OPTIMIZATION, 2024, 20(03): 569-588
  • [44] Iterative tensor eigen rank minimization for low-rank tensor completion
    Su, Liyu
    Liu, Jing
    Tian, Xiaoqing
    Huang, Kaiyu
    Tan, Shuncheng
    INFORMATION SCIENCES, 2022, 616: 303-329
  • [45] Sparse and Low-Rank Tensor Decomposition
    Shah, Parikshit
    Rao, Nikhil
    Tang, Gongguo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 28 (NIPS 2015), 2015, 28
  • [46] NONPARAMETRIC LOW-RANK TENSOR IMPUTATION
    Bazerque, Juan Andres
    Mateos, Gonzalo
    Giannakis, Georgios B.
    2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2012: 876-879
  • [47] Low-Rank Regression with Tensor Responses
    Rabusseau, Guillaume
    Kadri, Hachem
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [48] MULTIRESOLUTION LOW-RANK TENSOR FORMATS
    Mickelin, Oscar
    Karaman, Sertac
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2020, 41(03): 1086-1114
  • [49] SOLVING PHASELIFT BY LOW-RANK RIEMANNIAN OPTIMIZATION METHODS FOR COMPLEX SEMIDEFINITE CONSTRAINTS
    Huang, Wen
    Gallivan, K. A.
    Zhang, Xiangxiong
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2017, 39(05): B840-B859
  • [50] LOW-RANK TENSOR HUBER REGRESSION
    Wei, Yangxin
    Luo, Ziyan
    Chen, Yang
    PACIFIC JOURNAL OF OPTIMIZATION, 2022, 18(02): 439-458