On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems

Cited by: 2
Authors
Wen, Bo [1 ,2 ]
Xue, Xiaoping [2 ,3 ]
Affiliations
[1] Hebei Univ Technol, Inst Math, Tianjin, Peoples R China
[2] Harbin Inst Technol, Dept Math, Harbin, Heilongjiang, Peoples R China
[3] Harbin Inst Technol, Inst Adv Study Math, Harbin, Heilongjiang, Peoples R China
Keywords
Convex minimization; Proximal gradient algorithm; Extrapolation; Łojasiewicz inequality; Convergence
DOI
10.1007/s10898-019-00789-8
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we consider the proximal gradient algorithm with extrapolation for solving a class of convex nonsmooth minimization problems. We show that for a large class of extrapolation parameters, including those chosen in FISTA (Beck and Teboulle in SIAM J Imaging Sci 2:183-202, 2009), the successive changes of the iterates tend to 0. Moreover, based on the Łojasiewicz inequality, we establish the global convergence of the iterates generated by the proximal gradient algorithm with extrapolation under an additional assumption on the extrapolation coefficients. The assumption is general enough to allow the threshold of the extrapolation coefficients to be 1. In particular, we prove that the sequence of iterates has finite length. Finally, we perform numerical experiments on least squares problems with ℓ1 regularization to illustrate our theoretical results.
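To make the setting of the abstract concrete, the following is a minimal sketch (not the authors' implementation) of the proximal gradient algorithm with FISTA-type extrapolation applied to the ℓ1-regularized least squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. The step size 1/L with L = ||A||_2^2, the stopping rule based on the successive change of the iterates, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_grad_extrapolation(A, b, lam, max_iter=500, tol=1e-10):
    # Proximal gradient method with FISTA-type extrapolation for
    #   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient of the smooth part
    x = np.zeros(A.shape[1])
    x_prev = x.copy()
    t = 1.0
    for k in range(max_iter):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        beta = (t - 1.0) / t_next              # FISTA extrapolation coefficient
        y = x + beta * (x - x_prev)            # extrapolated point
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - grad / L, lam / L)
        if np.linalg.norm(x_next - x) <= tol:  # successive change of the iterates
            x_prev, x = x, x_next
            break
        x_prev, x, t = x, x_next, t_next
    return x
```

The quantity ||x_next - x|| monitored in the stopping rule is the "successive change of the iterates" that the paper shows tends to 0 for this class of extrapolation parameters.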
Pages: 767-787
Number of pages: 21