A CLASS OF REDUCED GRADIENT METHODS FOR HANDLING OPTIMIZATION PROBLEMS WITH LINEAR INEQUALITY CONSTRAINTS

Cited by: 0
Authors
Xu Chengxian (徐成贤)
Wei Bin (魏斌)
Affiliation
[1] Department of Mathematics, Xi'an Jiaotong University, Xi'an
Keywords
Nonlinear Programming; Reduced Gradient Method; Global Convergence
DOI
10.13299/j.cnki.amjcu.000440
Abstract
A class of reduced gradient methods for handling general optimization problems with linear equality and inequality constraints is suggested in this paper. Although a slack vector is introduced, the dimension of the problem is not increased, unlike the conventional approach of transforming inequality constraints into equality constraints by introducing slack variables. When an iterate x(k) is not a Kuhn-Tucker (K-T) point of the problem under consideration, different feasible descent directions can be obtained by different choices of the slack vector. The suggested method is globally convergent, and the numerical experiment reported in the paper shows that the method is efficient.
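The paper's own algorithm is not reproduced in this record, but the family it belongs to can be illustrated. Below is a minimal, hypothetical sketch of a classical reduced gradient step for a linearly *equality*-constrained problem (the paper's contribution, handling inequalities through a slack vector without enlarging the problem, is more general): the basic variables are eliminated through Ax = b and descent is performed in the remaining nonbasic variables. All function and variable names are our own, not the authors'.

```python
import numpy as np

def reduced_gradient(f_grad, A, b, x0, basic, nonbasic,
                     step=0.2, iters=500, tol=1e-10):
    """Illustrative reduced gradient descent for min f(x) s.t. A x = b.

    The basic variables x_B are eliminated via B x_B = b - N x_N,
    and a gradient step is taken in the nonbasic variables x_N.
    """
    x = x0.astype(float).copy()
    B = A[:, basic]        # square, nonsingular basis matrix
    N = A[:, nonbasic]
    for _ in range(iters):
        g = f_grad(x)
        # reduced gradient r = g_N - N^T B^{-T} g_B
        lam = np.linalg.solve(B.T, g[basic])
        r = g[nonbasic] - N.T @ lam
        if np.linalg.norm(r) < tol:
            break                       # x is (numerically) a K-T point
        x[nonbasic] -= step * r         # descend in the free variables
        # restore feasibility: B x_B = b - N x_N
        x[basic] = np.linalg.solve(B, b - N @ x[nonbasic])
    return x

# Toy problem: min (x1-1)^2 + (x2-2)^2 + (x3-3)^2  s.t.  x1+x2+x3 = 3
grad = lambda x: 2.0 * (x - np.array([1.0, 2.0, 3.0]))
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([3.0])
x0 = np.array([1.0, 1.0, 1.0])          # feasible starting point
x_star = reduced_gradient(grad, A, b, x0, basic=[2], nonbasic=[0, 1])
# x_star is close to the constrained minimizer (0, 1, 2)
```

Every iterate stays on the constraint manifold, which is the defining feature of reduced gradient methods; the paper's variant additionally keeps iterates feasible with respect to linear inequalities.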
Pages: 327-338
Page count: 12
References
1 item
[1] Ding-Zhu Du, Xiang-Sung Zhang. A convergence theorem of Rosen's gradient projection method. Mathematical Programming, 1986 (2).