Conjugate Gradients Acceleration of Coordinate Descent for Linear Systems

Cited by: 0
Author
Gordon, Dan [1]
Affiliation
[1] Univ Haifa, Dept Comp Sci, IL-34988 Haifa, Israel
Keywords
Coordinate descent; CD; CGCD; CGMN; Conjugate gradients acceleration; Gauss-Seidel; Kaczmarz algorithm; Linear systems; Matrix inversion; Multiple right-hand-sides; Parallelism; EFFICIENCY;
DOI
10.1007/s10915-023-02307-1
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
This paper introduces a conjugate gradients (CG) acceleration of the coordinate descent (CD) algorithm for linear systems. It is shown that the Kaczmarz algorithm (KACZ) can simulate CD exactly, so CD can be accelerated by CG in the same way as the CG acceleration of KACZ (Björck and Elfving in BIT 19:145-163, 1979). Experiments were carried out on large sets of problems of reconstructing bandlimited functions from random sampling. The randomness causes extreme variance between different instances of these problems, and hence extreme variance in the advantage of CGCD over CD: the reduction in the number of iterations achieved by CGCD ranges from about 50% to 90%, and beyond in some cases. The implementation of CGCD is simple. CGCD can also be used for the parallel solution of linear systems derived from partial differential equations, and for the efficient solution of multiple right-hand-side problems and matrix inversion.
Pages: 10
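
The abstract builds on cyclic coordinate descent, which for a symmetric positive definite system coincides with Gauss-Seidel sweeps. As a point of reference only, the following is a minimal Python sketch of that unaccelerated CD baseline under assumed names, test data, and stopping rule; it is not the paper's CGCD method, which additionally applies CG acceleration through the Kaczmarz equivalence described above.

    # Minimal sketch (assumption, not the paper's code): cyclic coordinate descent
    # for an SPD system A x = b, i.e. exact minimization of
    # f(x) = 0.5 * x^T A x - b^T x along one coordinate at a time (Gauss-Seidel).
    import numpy as np

    def coordinate_descent_sweep(A, b, x):
        """One full cyclic CD sweep over all coordinates (returns a new iterate)."""
        x = x.copy()
        n = len(b)
        for i in range(n):
            # Exact line search along the i-th unit vector for SPD A.
            r_i = b[i] - A[i, :] @ x      # i-th component of the current residual
            x[i] += r_i / A[i, i]         # optimal coordinate step
        return x

    def cd_solve(A, b, x0=None, sweeps=200, tol=1e-10):
        """Plain CD without CG acceleration, for comparison purposes."""
        x = np.zeros_like(b) if x0 is None else x0.copy()
        for _ in range(sweeps):
            x = coordinate_descent_sweep(A, b, x)
            if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
                break
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        M = rng.standard_normal((50, 50))
        A = M @ M.T + 50.0 * np.eye(50)   # illustrative SPD test matrix
        b = rng.standard_normal(50)
        x = cd_solve(A, b)
        print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))

In this sketch, CGCD would replace the repeated plain sweeps in cd_solve with conjugate-gradient-accelerated iterations; the details of that acceleration are given in the paper itself.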