A class of reduced gradient methods for handling general optimization problems with linear equality and inequality constraints is proposed in this paper. Although a slack vector is introduced, the dimension of the problem is not increased, unlike the conventional approach of converting the inequality constraints into equality constraints by introducing slack variables. When an iterate x^(k) is not a K-T (Kuhn-Tucker) point of the problem under consideration, different feasible descent directions can be obtained by different choices of the slack vector. The proposed method is globally convergent, and the numerical experiments reported in the paper show that it is efficient.
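For orientation, the following is a minimal sketch of one classical (Wolfe-type) reduced gradient step for the standard form min f(x) subject to Ax = b, x >= 0, on which the abstract's method builds. It is not the paper's slack-vector construction; the function name `reduced_gradient_step`, the given basic/nonbasic partition, and the crude step-length rule are illustrative assumptions.

```python
# Sketch of a classical reduced gradient step for  min f(x)  s.t.  A x = b,  x >= 0.
# Assumes a basic/nonbasic partition with B = A[:, basic] nonsingular is supplied.
import numpy as np

def reduced_gradient_step(f_grad, A, x, basic, nonbasic, step=1e-2):
    """Return a feasible descent direction d (with A d = 0) and a trial point."""
    B = A[:, basic]                      # basis matrix, assumed nonsingular
    N = A[:, nonbasic]
    g = f_grad(x)
    # Reduced gradient: r = g_N - N^T B^{-T} g_B
    y = np.linalg.solve(B.T, g[basic])
    r = g[nonbasic] - N.T @ y
    # Nonbasic direction: move against r, but do not push active bounds negative
    d_N = np.where((x[nonbasic] <= 0) & (r > 0), 0.0, -r)
    # Basic direction chosen so that A d = B d_B + N d_N = 0
    d_B = -np.linalg.solve(B, N @ d_N)
    d = np.zeros_like(x)
    d[nonbasic] = d_N
    d[basic] = d_B
    # Crude step length keeping x + t d >= 0 (a proper line search would go here)
    neg = d < 0
    t = step if not neg.any() else min(step, float(np.min(-x[neg] / d[neg])))
    return d, x + t * d
```

The sketch only conveys the basic mechanism (eliminating the basic variables through the linear constraints and descending in the reduced space); the paper's contribution is how the slack vector is chosen so that inequality constraints are handled without enlarging the problem dimension.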