ACCELERATING THE HYBRID STEEPEST DESCENT METHOD FOR AFFINELY CONSTRAINED CONVEX COMPOSITE MINIMIZATION TASKS

Cited by: 0
Authors
Slavakis, Konstantinos [1]
Yamada, Isao [2]
Ono, Shunsuke [3]
Affiliations
[1] SUNY Buffalo, Dept Elect Engn, Buffalo, NY 14260 USA
[2] Tokyo Inst Technol, Dept Inform & Commun Engn, Tokyo 1528550, Japan
[3] Tokyo Inst Technol, Lab Future Interdisciplinary Res Sci & Tech, Yokohama, Kanagawa 2268503, Japan
Funding
US National Science Foundation
Keywords
Composite optimization; convexity; nonexpansive mappings; hybrid steepest descent method; variational-inequality problem; FIXED-POINT SET; OPTIMIZATION; ALGORITHM; DIRECTION;
DOI
Not available
Chinese Library Classification
O42 [Acoustics]
Discipline Codes
070206; 082403
Abstract
The hybrid steepest descent method (HSDM) [Yamada, '01] was introduced as a low-computational-complexity tool for solving convex variational-inequality problems over the fixed-point set of nonexpansive mappings in Hilbert spaces. Motivated by results on decentralized optimization, this study introduces an HSDM variant that extends, for the first time, the applicability of HSDM to affinely constrained composite convex minimization tasks over Euclidean spaces, the same class of problems solved by the popular alternating direction method of multipliers and by primal-dual methods. The proposed scheme offers attributes desirable for large-scale optimization tasks that no other member of the HSDM family of algorithms provides, in part or in full: tunable computational complexity; a step-size parameter that stays constant over recursions, thus promoting acceleration of convergence; no boundedness constraints on iterates and/or gradients; and the ability to deal with convex losses that comprise a smooth and a non-smooth part, where the smooth part is only required to have a Lipschitz-continuous derivative. Convergence guarantees and rates are established. Numerical tests on synthetic data and on colored-image inpainting underline the rich potential of the proposed scheme for large-scale optimization tasks.
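The abstract describes the problem class, minimizing f(x) + g(x) subject to A x = b with f smooth (Lipschitz-continuous gradient) and g non-smooth, but the record carries no algorithmic detail. For orientation only, below is a minimal sketch of the classical HSDM recursion of [Yamada, '01], x_{k+1} = T(x_k) - lambda_k * mu * grad f(T(x_k)), applied to a toy smooth problem whose affine constraint set is the fixed-point set of a projection T; the paper's accelerated variant (constant step size, handling of the non-smooth term g) is not reproduced here. All names and data are illustrative assumptions, not the authors' code.

import numpy as np

# Toy instance: minimize f(x) = 0.5 * ||x - c||^2  subject to  A x = b.
rng = np.random.default_rng(0)
m, n = 3, 8
A = rng.standard_normal((m, n))   # assumed full row rank
b = rng.standard_normal(m)
c = rng.standard_normal(n)

AAt_inv = np.linalg.inv(A @ A.T)

def T(x):
    # Metric projection onto {x : A x = b}: nonexpansive, with
    # Fix(T) equal to the feasible (affine) set.
    return x - A.T @ (AAt_inv @ (A @ x - b))

def grad_f(x):
    # Gradient of f(x) = 0.5 * ||x - c||^2 (1-Lipschitz, 1-strongly monotone).
    return x - c

mu = 1.0                 # admissible since mu must lie in (0, 2) here
x = np.zeros(n)
for k in range(1, 2001):
    y = T(x)
    x = y - (mu / k) * grad_f(y)   # diminishing step lambda_k = 1/k

x = T(x)  # report a feasible point
print("constraint residual:", np.linalg.norm(A @ x - b))
print("distance to true solution:", np.linalg.norm(x - T(c)))  # minimizer is T(c)

With the diminishing step lambda_k = 1/k and mu in (0, 2), the classical convergence conditions of HSDM are met on this toy instance; the paper's contribution is precisely to remove the need for such a vanishing step size, which is what slows the classical recursion down.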
Pages: 4711-4715
Page count: 5