A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems

Times Cited: 2
Authors
Liu, Hongwei [1 ]
Wang, Ting [2 ]
Liu, Zexian [3 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710126, Peoples R China
[2] Xian Univ Posts & Telecommun, Sch Sci, Xian 710121, Peoples R China
[3] Guizhou Univ, Sch Math & Stat, Guiyang 550025, Peoples R China
Funding
National Science Foundation (US);
Keywords
Nonconvex; Nonsmooth; Accelerated proximal gradient method; Variable stepsize strategy; Kurdyka-Lojasiewicz property; Convergence; ALTERNATING LINEARIZED MINIMIZATION; FORWARD-BACKWARD ALGORITHM; THRESHOLDING ALGORITHM; CONVERGENCE-RATES; SELECTION;
DOI
10.1007/s10898-024-01366-4
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we consider the problem of minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting, which arises in many contemporary applications such as machine learning, statistics, and signal/image processing. To solve this problem, we propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy. Incorporating an inertial term into the proximal gradient method is a simple and efficient acceleration technique, but the descent property of the proximal gradient algorithm is lost. In our algorithm, the iterates generated by the inertial proximal gradient scheme are accepted when the objective function values decrease or increase appropriately; otherwise, the iterate is generated by the proximal gradient scheme, which ensures that the function values on a subset of the iterates are decreasing. We also introduce a variable stepsize strategy that requires neither a line search nor knowledge of the Lipschitz constant, making the algorithm easy to implement. We show that the sequence of iterates generated by the algorithm converges to a critical point of the objective function. Further, under the assumption that the objective function satisfies the Kurdyka-Lojasiewicz inequality, we prove convergence rates for the objective function values and the iterates. Moreover, numerical results on both convex and nonconvex problems are reported to demonstrate the effectiveness and superiority of the proposed method and stepsize strategy.
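The accept-or-fall-back mechanism described in the abstract can be sketched as follows. This is an illustrative simplification, not the authors' exact algorithm: the paper's variable stepsize strategy is replaced by a fixed 1/L step, the nonmonotone test here compares against the best objective value seen so far, and the names `nm_apg`, `soft_threshold`, and the parameter `delta` are hypothetical. The sketch is applied to a convex l1-regularized least-squares instance so its behavior is easy to check.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def nm_apg(A, b, lam, x0, max_iter=500, delta=1e-4):
    """Illustrative nonmonotone accelerated proximal gradient scheme for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    The extrapolated (inertial) step is accepted only if it gives
    sufficient nonmonotone decrease; otherwise we fall back to a plain
    proximal gradient step from x_k, which restores a descent safeguard.
    """
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of grad f
    step = 1.0 / L                          # fixed stepsize (the paper uses a variable strategy)
    F = lambda v: 0.5 * np.sum((A @ v - b) ** 2) + lam * np.sum(np.abs(v))
    grad = lambda v: A.T @ (A @ v - b)

    x_prev, x = x0.copy(), x0.copy()
    t_prev, t = 1.0, 1.0
    best = F(x)                             # best objective value seen so far
    for _ in range(max_iter):
        # Inertial (extrapolated) point, FISTA-style momentum.
        y = x + (t_prev - 1.0) / t * (x - x_prev)
        z = soft_threshold(y - step * grad(y), step * lam)
        if F(z) <= best - delta * np.sum((z - y) ** 2):
            x_new = z                       # accelerated step accepted
        else:
            # Safeguard: plain proximal gradient step from the last iterate.
            x_new = soft_threshold(x - step * grad(x), step * lam)
        best = min(best, F(x_new))
        x_prev, x = x, x_new
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    return x
```

The safeguard branch is what makes the objective values decrease along a subsequence of iterates, as the abstract states, while the accepted inertial steps provide the acceleration.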
Pages: 863 - 897
Number of Pages: 35
Related Papers
50 records in total
  • [21] General inertial proximal gradient method for a class of nonconvex nonsmooth optimization problems
    Wu, Zhongming
    Li, Min
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2019, 73: 129 - 158
  • [22] On the global convergence of a nonmonotone proximal bundle method for convex nonsmooth minimization
    Hou, Liusheng
    Sun, Wenyu
    OPTIMIZATION METHODS & SOFTWARE, 2008, 23 (02): 227 - 235
  • [23] Proximal variable smoothing method for three-composite nonconvex nonsmooth minimization with a linear operator
    Liu, Yuncheng
    Xia, Fuquan
    NUMERICAL ALGORITHMS, 2024, 96: 237 - 266
  • [25] Inertial Proximal Alternating Linearized Minimization (iPALM) for Nonconvex and Nonsmooth Problems
    Pock, Thomas
    Sabach, Shoham
    SIAM JOURNAL ON IMAGING SCIENCES, 2016, 9 (04): 1756 - 1787
  • [26] On the Linear Convergence of a Proximal Gradient Method for a Class of Nonsmooth Convex Minimization Problems
    Zhang, Haibin
    Jiang, Jiaojiao
    Luo, Zhi-Quan
    JOURNAL OF THE OPERATIONS RESEARCH SOCIETY OF CHINA, 2013, 1 (02): 163 - 186
  • [27] A Stochastic Proximal Alternating Minimization for Nonsmooth and Nonconvex Optimization
    Driggs, Derek
    Tang, Junqi
    Liang, Jingwei
    Davies, Mike
    Schonlieb, Carola-Bibiane
    SIAM JOURNAL ON IMAGING SCIENCES, 2021, 14 (04): 1932 - 1970
  • [28] A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization
    Li, Zhize
    Li, Jian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [29] Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds
    Feng, Shuailing
    Huang, Wen
    Song, Lele
    Ying, Shihui
    Zeng, Tieyong
    OPTIMIZATION LETTERS, 2022, 16 (08): 2277 - 2297