Accelerated Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization

Cited by: 0
Authors:
Andrei, Neculai [1 ,2 ]
Affiliations:
[1] Ctr Adv Modeling & Optimizat, Res Inst Informat, Bucharest 1, Romania
[2] Acad Romanian Scientists, Bucharest 5, Romania
Source:
STUDIES IN INFORMATICS AND CONTROL | 2009, Vol. 18, No. 3
Keywords:
Unconstrained optimization; conjugate gradient method; Newton direction; modified secant condition; numerical comparisons; GLOBAL CONVERGENCE PROPERTIES; PERFORMANCE; DESCENT;
DOI: not available
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract:
Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. Over 25 variants of conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed from an approximation of the Hessian/vector product obtained through a modified secant condition. For the search direction computation, the method uses both the gradient and the function values at two successive iteration points, achieving high-order accuracy in approximating the second-order curvature of the minimizing function. For step length computation, the method exploits the fact that the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in an unpredictable manner. We therefore suggest an acceleration scheme able to improve the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm remains linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms (including CG_DESCENT by Hager and Zhang [19], CONMIN by Shanno and Phua [29], SCALCG by Andrei [3-5], and LBFGS by Liu and Nocedal [22]) on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the suggested algorithm outperforms the known conjugate gradient algorithms and LBFGS.
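To make the abstract's two ingredients concrete, the Python sketch below combines a β_k built from a modified secant vector that blends the gradient difference with function-value information, and an acceleration of the step length based on the gradient at the trial point. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the θ correction, the Hestenes-Stiefel-type β_k, the backtracking line search, and the test function are all illustrative choices.

```python
import numpy as np

def armijo_line_search(f, grad, x, d, alpha0=1.0, c1=1e-4, max_iter=30):
    """Simple backtracking line search enforcing the Armijo condition
    (a stand-in for the Wolfe line search typically used in CG codes)."""
    fx = f(x)
    dg = grad(x).dot(d)
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c1 * alpha * dg:
            break
        alpha *= 0.5
    return alpha

def accelerated_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of an accelerated nonlinear CG iteration with a
    modified-secant-based beta_k (illustrative formulas)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = armijo_line_search(f, grad, x, d)

        # Acceleration of the step length: a Newton step for
        # phi(t) = f(x + t*alpha*d), using phi'(0) and a finite-difference
        # estimate of phi''(0) from the gradient at the trial point.
        gz = grad(x + alpha * d)
        a_k = alpha * g.dot(d)         # phi'(0), negative along a descent direction
        b_k = alpha * (gz - g).dot(d)  # ~ phi''(0), finite difference of phi'
        if b_k > 1e-12:                # accept only under positive curvature
            alpha *= -a_k / b_k

        x_new = x + alpha * d
        g_new = grad(x_new)

        # Modified secant vector: augment y_k = g_{k+1} - g_k with
        # function-value information (one common choice from the literature).
        s = x_new - x
        y = g_new - g
        theta = 6.0 * (f(x) - f(x_new)) + 3.0 * (g + g_new).dot(s)
        z = y + (theta / s.dot(s)) * s

        # Hestenes-Stiefel-type beta_k built from the modified secant vector.
        denom = d.dot(z)
        beta = g_new.dot(z) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) >= 0.0:        # restart with steepest descent if needed
            d = -g_new

        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Illustrative Rosenbrock-like test function, not from the paper.
    f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        2.0 * (x[0] - 1.0) - 40.0 * x[0] * (x[1] - x[0] ** 2),
        20.0 * (x[1] - x[0] ** 2),
    ])
    print(accelerated_cg(f, grad, np.array([-1.2, 1.0])))
```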
Pages: 211 - 232
Number of pages: 22
Related Papers (50 in total)
  • [41] A Modified Descent Spectral Conjugate Gradient Method for Unconstrained Optimization
    Nezhadhosein, Saeed
    IRANIAN JOURNAL OF SCIENCE AND TECHNOLOGY TRANSACTION A-SCIENCE, 2021, 45 (01): : 209 - 220
  • [42] A Modified Form of Conjugate Gradient Method for Unconstrained Optimization Problems
    Ghani, Nur Hamizah Abdul
    Rivaie, Mohd
    Mamat, Mustafa
    INNOVATIONS THROUGH MATHEMATICAL AND STATISTICAL RESEARCH: PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON MATHEMATICAL SCIENCES AND STATISTICS (ICMSS2016), 2016, 1739
  • [44] Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
    Yabe, H
    Takano, M
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2004, 28 (02) : 203 - 225
  • [45] An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization
    Zexian Liu
    Hongwei Liu
    Yu-Hong Dai
    Computational Optimization and Applications, 2020, 75 : 145 - 167
  • [46] A q-CONJUGATE GRADIENT ALGORITHM FOR UNCONSTRAINED OPTIMIZATION PROBLEMS
    Lai, Kin Keung
    Mishra, Shashi Kant
    Ram, Bhagwat
    PACIFIC JOURNAL OF OPTIMIZATION, 2021, 17 (01): : 57 - 76
  • [47] A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
    Andrei, Neculai
    APPLIED MATHEMATICS LETTERS, 2007, 20 (06) : 645 - 650
  • [48] Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
    Narushima, Yasushi
    Yabe, Hiroshi
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2012, 236 (17) : 4303 - 4317
  • [49] A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems
    Wang, Zhan
    Li, Pengyuan
    Li, Xiangrong
    Pham, Hongtruong
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020 (2020)
  • [50] Descent three-term conjugate gradient methods based on secant conditions for unconstrained optimization
    Kobayashi, Hiroshi
    Narushima, Yasushi
    Yabe, Hiroshi
    OPTIMIZATION METHODS & SOFTWARE, 2017, 32 (06): : 1313 - 1329