Efficiency analysis on a truncated Newton method with preconditioned conjugate gradient technique for optimization

Cited by: 0
Authors:
Zhang, JZ [1]
Deng, NY [1]
Wang, ZZ [1]
Affiliation:
[1] City Univ Hong Kong, Dept Math, Hong Kong, Hong Kong, Peoples R China
Keywords:
unconstrained optimization; Newton's method; preconditioned conjugate gradient method; efficiency coefficient
DOI: not available
CLC number: TP31 [Computer Software]
Discipline codes: 081202; 0835
Abstract
A large body of numerical experiments has shown that, among local algorithms for solving unconstrained optimization problems, the truncated Newton method with preconditioned conjugate gradient (PCG) subiterations is very efficient. In this paper, we investigate its efficiency from a theoretical point of view. The question is how much more efficient it is, in theory, than Newton's method with Cholesky factorization. We give a quantitative answer by constructing a truncated Newton method with PCG subiterations, Algorithm II below. Suppose Newton's method converges with a one-step Q-order alpha rate (alpha >= 2). We first prove that Algorithm II has the same convergence rate. We then study its average number of arithmetic operations per step and the corresponding number that Newton's method needs, and we obtain an upper bound for the ratio of these two numbers. This upper bound is a quantitative, theoretical estimate of the saving that Algorithm II can achieve. Its values, which are listed in the paper, show that the saving is rather remarkable.
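The abstract contrasts exact Newton steps (a full Cholesky solve of the Newton system per iteration) with a truncated Newton method, where each Newton system H d = -g is solved only approximately by a few preconditioned CG iterations, stopped by a forcing rule tied to the gradient norm. The following is a minimal sketch of that idea, not the paper's Algorithm II: the Jacobi preconditioner, the test function, and the forcing rule min(0.5, ||g||) are illustrative assumptions.

```python
import numpy as np

def pcg(H, g, tol, max_iter=50):
    """Approximately solve H d = -g by preconditioned CG with a Jacobi
    preconditioner M = diag(H), truncating once the residual norm drops
    below tol * ||g|| (the forcing rule that makes the method 'truncated')."""
    d = np.zeros_like(g)
    r = -g.copy()                     # residual of H d = -g at d = 0
    M_inv = 1.0 / np.diag(H)          # Jacobi preconditioner (assumes H has positive diagonal)
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Hp = H @ p
        alpha = rz / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) <= tol * np.linalg.norm(g):
            break                     # truncate the inner iteration early
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return d

def truncated_newton(grad, hess, x0, tol=1e-8, max_outer=50):
    """Outer Newton loop: each step uses an inexact PCG solve of the
    Newton system instead of a Cholesky factorization."""
    x = x0.astype(float)
    for _ in range(max_outer):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Tighten the inner tolerance as ||g|| shrinks to retain fast local convergence.
        x = x + pcg(hess(x), g, tol=min(0.5, gnorm))
    return x

# Illustrative strongly convex test problem (an assumption, not from the paper):
# f(x) = 0.5 x^T A x + sum(exp(x)), with A symmetric positive definite.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad_f = lambda x: A @ x + np.exp(x)
hess_f = lambda x: A + np.diag(np.exp(x))

x_star = truncated_newton(grad_f, hess_f, np.array([0.0, 0.0]))
print(x_star, np.linalg.norm(grad_f(x_star)))
```

Because the inner tolerance shrinks with the gradient norm, the inexact steps approach exact Newton steps near the solution, which is the mechanism behind the matching convergence rate claimed in the abstract.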
Pages: 383-416 (34 pages)
Related papers (50 records)
  • [1] A truncated conjugate gradient method with an inexact Gauss-Newton technique for solving nonlinear systems
    Yong Zhang
    Detong Zhu
    Journal of Applied Mathematics and Computing, 2012, 38 (1-2) : 551 - 564
  • [3] Stochastic optimization using the stochastic preconditioned conjugate gradient method
    Oakley, DR
    Sues, RH
    AIAA JOURNAL, 1996, 34 (09) : 1969 - 1971
  • [4] Preconditioned conjugate gradient method on the hypercube
    Abe, G.
    Hane, K.
    Conference on Hypercube Concurrent Computers and Applications, 1988,
  • [5] SAOR preconditioned conjugate gradient method
    Wang, Jianguo
    Meng, Guoyan
    IITA 2007: WORKSHOP ON INTELLIGENT INFORMATION TECHNOLOGY APPLICATION, PROCEEDINGS, 2007, : 331 - 334
  • [6] Fast Variable Preconditioned Conjugate Gradient Method Using Deflation technique
    Watanabe, Kota
    Sakai, Yuki
    2016 IEEE CONFERENCE ON ELECTROMAGNETIC FIELD COMPUTATION (CEFC), 2016,
  • [7] AOR Preconditioned Conjugate Gradient Method
    Wang, Jianguo
    Zhao, Qingshan
    ADVANCES IN MATRIX THEORY AND ITS APPLICATIONS, VOL 1: PROCEEDINGS OF THE EIGHTH INTERNATIONAL CONFERENCE ON MATRIX THEORY AND ITS APPLICATIONS, 2008, : 258 - 261
  • [8] On the truncated conjugate gradient method
    Yuan, Y
    MATHEMATICAL PROGRAMMING, 2000, 87 (03) : 561 - 573
  • [10] Guardband Optimization for the Preconditioned Conjugate Gradient Algorithm
    Lylina, Natalia
    Holst, Stefan
    Jafarzadeh, Hanieh
    Kourfali, Alexandra
    Wunderlich, Hans-Joachim
    2023 53RD ANNUAL IEEE/IFIP INTERNATIONAL CONFERENCE ON DEPENDABLE SYSTEMS AND NETWORKS WORKSHOPS, DSN-W, 2023, : 195 - 198