Some remarks on conjugate gradient methods without line search

Cited: 1
Authors
Wang, Cheng-jing [1]
Affiliations
[1] Zhejiang Univ, Dept Math, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
conjugate gradient methods; without line search; "fixed" steplength; minimizing method; limited memory BFGS method
DOI
10.1016/j.amc.2006.01.040
CLC Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
The conjugate gradient method is widely used in unconstrained optimization, especially for large-scale problems. However, the line search in the conjugate gradient method is sometimes very difficult or prohibitively expensive. Sun and Zhang [J. Sun, J.P. Zhang, Global convergence of conjugate gradient methods without line search, Annals of Operations Research 103 (2001) 161-173] showed that by taking a "fixed" steplength $\alpha_k$ defined by the formula $\alpha_k = -\delta\, g_k^T d_k / (d_k^T Q_k d_k)$, the conjugate gradient method is globally convergent for several popular choices of $\beta_k$ without line search. In the simplest case all $Q_k$ could be identity matrices; however, this choice does not even guarantee the descent property. In this paper, we study some methods for selecting $Q_k$ that are based on the amount of descent and are superior to taking $Q_k \equiv I$ (the identity matrix). (c) 2006 Elsevier Inc. All rights reserved.
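To make the "fixed" steplength concrete, the following is a minimal Python sketch of a conjugate gradient iteration using the Sun-Zhang steplength $\alpha_k = -\delta\, g_k^T d_k / (d_k^T Q_k d_k)$ with the simplest choice $Q_k = I$ and the Fletcher-Reeves $\beta_k$. The function names, the test quadratic, and the value $\delta = 0.1$ are illustrative assumptions; the paper's actual contribution, the descent-based selection of $Q_k$, is not reproduced here.

```python
import numpy as np

def cg_fixed_step(grad, x0, delta=0.1, tol=1e-8, max_iter=500):
    """Conjugate gradient iteration without line search (sketch).

    Steplength: alpha_k = -delta * g_k^T d_k / (d_k^T Q_k d_k), with the
    simplest choice Q_k = I, so the denominator reduces to ||d_k||^2.
    beta_k is the Fletcher-Reeves choice.  delta must be small enough for
    the Sun-Zhang convergence theory to apply; 0.1 is an illustrative value.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # "Fixed" steplength: computed from a formula, no line search.
        alpha = -delta * (g @ d) / (d @ d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta_k
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative usage on the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b and whose minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = cg_fixed_step(lambda x: A @ x - b, x0=np.zeros(2))
```

Note that with $Q_k = I$ the direction $d_k$ need not be a descent direction for a general $\beta_k$; this is precisely the deficiency that motivates the descent-based choices of $Q_k$ studied in the paper.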
Pages: 370-379
Page count: 10