A self-scaling memoryless BFGS based conjugate gradient method using multi-step secant condition for unconstrained minimization

Cited by: 0
Authors:
Kim, Yongjin [1 ]
Jong, Yunchol [1 ]
Kim, Yong [1 ]
Affiliations:
[1] Univ Sci, Dept Math, Unjong Dist 355, Pyongyang 950003, North Korea
Keywords:
unconstrained optimization; conjugate gradient method; multi-step secant condition; self-scaling; improved Wolfe line search
Keywords Plus:
quasi-Newton methods; global convergence; descent; algorithm; property
DOI:
10.21136/AM.2024.0204-23
Chinese Library Classification:
O29 (Applied Mathematics)
Discipline Code:
070104
Abstract:
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require the storage of matrices. Based on the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method, the conjugate gradient algorithms CG-DESCENT and CGOPT were proposed by W. Hager and H. Zhang (2005) and by Y. Dai and C. Kou (2013), respectively. These two conjugate gradient methods perform more efficiently than the SSML-BFGS method, and C. Kou and Y. Dai (2015) therefore proposed suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. To improve the modified SSML-BFGS method, in this paper we present an efficient SSML-BFGS-type three-term conjugate gradient method for unconstrained minimization that uses the Ford-Moghrabi secant equation instead of the usual secant equation. The method is shown to be globally convergent under certain assumptions. Numerical results comparing the method with methods based on the usual secant equation are reported.
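Since this record carries only the abstract, the following is a minimal sketch of the two ingredients it names, not the authors' algorithm. It assumes the two-step form of the Ford-Moghrabi secant condition, B_{k+1}(s_k - psi*s_{k-1}) = y_k - psi*y_{k-1}, and the standard Perry-Shanno form of the self-scaling memoryless BFGS matrix; the function names, the scaling parameter tau, and the weight psi are illustrative assumptions.

import numpy as np

def multi_step_pair(s_k, y_k, s_prev, y_prev, psi):
    # Two-step (Ford-Moghrabi type) secant pair: the usual pair (s_k, y_k)
    # is replaced by a linear combination of the two most recent pairs, so
    # curvature information from two steps enters the update.  psi is left
    # as a free weight here; Ford and Moghrabi derive it from an
    # interpolating curve.
    r = s_k - psi * s_prev
    w = y_k - psi * y_prev
    return r, w

def ssml_bfgs_direction(g, s, y, tau=None):
    # Three-term direction d = -H g, where H is the memoryless BFGS matrix
    # built from the pair (s, y) and scaled by tau.  Only inner products
    # appear, so no matrix is ever formed or stored.  Assumes s @ y > 0,
    # which a Wolfe-type line search guarantees.
    sy = s @ y
    yy = y @ y
    if tau is None:
        tau = sy / yy  # Oren-Luenberger self-scaling choice (an assumption)
    gs = g @ s
    gy = g @ y
    coef_s = tau * gy / sy - (1.0 + tau * yy / sy) * gs / sy
    coef_y = tau * gs / sy
    return -tau * g + coef_s * s + coef_y * y

Feeding the pair (r, w) returned by multi_step_pair into ssml_bfgs_direction in place of (s, y) captures the flavor of the construction described above; the paper's actual parameter choices, descent safeguards, and improved Wolfe line search are not reproduced here.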
Pages: 847-866
Number of pages: 20