Global Convergence of a Modified Gradient Projection Method for Convex Constrained Problems

Cited by: 1
Authors
Qing-ying Sun¹, Chang-yu Wang², Zhen-jun Shi²
Institutions
1. School of Mathematics and Computational Sciences
Funding
National Natural Science Foundation of China
Keywords
Nonlinear programming; projection; generalized Armijo step size rule; convergence;
DOI
Not available
CLC Classification Number
O224 [Mathematical Theory of Optimization]
Subject Classification Number
Abstract
In this paper, the continuously differentiable optimization problem min{f(x) : x ∈ Ω}, where Ω ⊆ R^n is a nonempty closed convex set, is considered. The gradient projection method of Calamai and Moré (Math. Programming, Vol. 39, pp. 93-116, 1987) is modified by a memory gradient to improve its convergence rate. Convergence of the new method is analyzed without assuming that the iteration sequence {x^k} is bounded. Moreover, it is shown that, when f(x) is a pseudo-convex (quasi-convex) function, the new method has strong convergence results. Numerical results show that the method proposed in this paper is more effective than the gradient projection method.
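The abstract describes a projected gradient iteration whose search direction combines the current negative gradient with the previous direction (a memory gradient), and whose step size is chosen by a generalized Armijo rule. The sketch below is only an illustration of that idea, not the paper's algorithm: it assumes a simple box constraint as Ω, a fixed memory weight `eta`, and plain backtracking in place of the generalized Armijo rule; all parameter names and values are hypothetical.

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi] (stand-in for a general closed convex Omega)."""
    return np.clip(x, lo, hi)

def memory_gradient_projection(f, grad, x0, lo, hi, beta=0.5, sigma=1e-4, eta=0.5,
                               max_iter=200, tol=1e-8):
    """Projected gradient iteration with a memory (previous-direction) term and a
    backtracking Armijo-type step size.  Illustrative only; the paper's generalized
    Armijo rule and memory-gradient weighting differ in detail."""
    x = project_box(np.asarray(x0, dtype=float), lo, hi)
    d_prev = np.zeros_like(x)
    for _ in range(max_iter):
        g = grad(x)
        d = -g + eta * d_prev              # memory gradient direction (hypothetical weighting)
        alpha = 1.0
        fx = f(x)
        while True:                        # backtracking Armijo test on the projected step
            x_new = project_box(x + alpha * d, lo, hi)
            if f(x_new) <= fx + sigma * g.dot(x_new - x) or alpha < 1e-12:
                break
            alpha *= beta
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        d_prev = d
        x = x_new
    return x

if __name__ == "__main__":
    # Toy check: minimize ||x - c||^2 over the box [0, 1]^2 with c outside the box.
    c = np.array([2.0, -1.0])
    f = lambda x: np.sum((x - c) ** 2)
    grad = lambda x: 2.0 * (x - c)
    x_star = memory_gradient_projection(f, grad, x0=np.zeros(2), lo=0.0, hi=1.0)
    print(x_star)  # approaches [1.0, 0.0], the projection of c onto the box
```

In the toy run the iterate reaches the projection of c onto the feasible box, which is the constrained minimizer of this quadratic; the memory term only changes the path, not the limit, in this simple case.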
Pages: 227-242
Number of pages: 16