An extended delayed weighted gradient algorithm for solving strongly convex optimization problems

Cited by: 1
Authors
Andreani, R. [1 ]
Oviedo, H. [2 ]
Raydan, M. [3 ]
Secchin, L. D. [4 ]
Affiliations
[1] Univ Estadual Campinas, Dept Appl Math, IMECC UNICAMP, BR-13083859 Campinas, SP, Brazil
[2] Fundacao Getulio Vargas FGV EMAp, Escola Matemat Aplicada, Rio De Janeiro, RJ, Brazil
[3] FCT NOVA, Ctr Math & Applicat NovaMath, P-2829516 Caparica, Portugal
[4] Univ Fed Espirito Santo, Dept Appl Math, Rodovia BR 101,Km 60, BR-29932540 Sao Mateus, ES, Brazil
Funding
São Paulo Research Foundation (FAPESP), Brazil;
Keywords
Gradient methods; Conjugate gradient methods; Strongly convex functions; Large-scale optimization; BARZILAI; SOFTWARE;
DOI
10.1016/j.cam.2022.114525
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
The recently developed delayed weighted gradient method (DWGM) is competitive with the well-known conjugate gradient (CG) method for the minimization of strictly convex quadratic functions. Like the CG method, DWGM enjoys some key optimality and orthogonality properties that justify its practical performance. The main difference from the CG method is that, instead of minimizing the objective function on the entire explored subspace, DWGM minimizes the 2-norm of the gradient vector on that subspace. The main purpose of this study is to extend DWGM to strongly convex nonquadratic minimization problems while keeping a low computational cost per iteration. We incorporate the scheme into a tolerant line-search globalization strategy and show that it exhibits q-linear convergence to the unique global solution. We compare the proposed extended DWGM with state-of-the-art methods for large-scale unconstrained minimization problems, using some well-known strongly convex test problems as well as regularized logistic regression problems that appear in machine learning. Our numerical results illustrate that the proposed scheme is promising and exhibits fast convergence. Moreover, we show through numerical experiments on CUTEst problems that the proposed extended DWGM can be very effective in accelerating the convergence of a well-established Barzilai-Borwein-type method when the iterates get close to minimizers of non-convex functions. (C) 2022 Elsevier B.V. All rights reserved.
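For readers unfamiliar with the underlying iteration, the following is a minimal Python sketch of the original quadratic-case DWGM (for f(x) = ½xᵀAx − bᵀx) on which this paper builds, assuming the step-size and delayed-combination formulas reported in the DWGM literature; the function name dwgm, the initialization, and the zero-denominator guard are illustrative choices of this sketch, and the paper's tolerant line-search extension to nonquadratic strongly convex problems is not reproduced here.

```python
# Minimal sketch (not the authors' code) of the quadratic-case DWGM for
# f(x) = 0.5*x'Ax - b'x with A symmetric positive definite. The step alpha_k
# and the delayed combination via beta_k follow the DWGM literature; the
# initialization and the zero-denominator guard are illustrative assumptions.
import numpy as np

def dwgm(A, b, x0, tol=1e-8, max_iter=1000):
    x_prev, x = x0.copy(), x0.copy()
    g = A @ x0 - b               # gradient of the quadratic at x0
    g_prev = g.copy()            # delayed gradient (x_{-1} := x_0 by convention)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        w = A @ g                                # single matrix-vector product per iteration
        alpha = (g @ w) / (w @ w)                # minimizes ||g_k - alpha*A g_k||_2 over alpha
        y = x - alpha * g                        # intermediate gradient step
        r = g - alpha * w                        # gradient at y (gradient is affine in x)
        d = r - g_prev
        dd = d @ d
        if dd == 0.0:                            # degenerate direction: accept the plain step
            x_prev, g_prev, x, g = x, g, y, r
            continue
        beta = -(g_prev @ d) / dd                # minimizes ||g_prev + beta*d||_2 over beta
        x_next = x_prev + beta * (y - x_prev)    # delayed combination with x_{k-1}
        g_next = g_prev + beta * d               # gradient update, no extra product with A
        x_prev, g_prev = x, g
        x, g = x_next, g_next
    return x

# Example: minimize the quadratic associated with a small SPD system.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)                # well-conditioned SPD matrix
    b = rng.standard_normal(50)
    x = dwgm(A, b, np.zeros(50))
    print(np.linalg.norm(A @ x - b))             # residual on the order of tol
```

Note the hallmark of DWGM visible in the sketch: each iteration costs a single product with A, and beta is chosen to minimize the 2-norm of the gradient over the delayed combination rather than the objective value, which is the distinguishing feature the abstract contrasts with CG.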
Pages: 19