A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems

Cited by: 2
|
Authors
Liu, Hongwei [1 ]
Wang, Ting [2 ]
Liu, Zexian [3 ]
Affiliations
[1] Xidian Univ, Sch Math & Stat, Xian 710126, Peoples R China
[2] Xian Univ Posts & Telecommun, Sch Sci, Xian 710121, Peoples R China
[3] Guizhou Univ, Sch Math & Stat, Guiyang 550025, Peoples R China
Funding
U.S. National Science Foundation;
关键词
Nonconvex; Nonsmooth; Accelerated proximal gradient method; Variable stepsize strategy; Kurdyka-Lojasiewicz property; Convergence; ALTERNATING LINEARIZED MINIMIZATION; FORWARD-BACKWARD ALGORITHM; THRESHOLDING ALGORITHM; CONVERGENCE-RATES; SELECTION;
DOI
10.1007/s10898-024-01366-4
CLC Number
C93 [Management Science]; O22 [Operations Research];
Subject Classification Code
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we consider the problem of minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting, which arises in many contemporary applications such as machine learning, statistics, and signal/image processing. To solve this problem, we propose a new nonmonotone accelerated proximal gradient method with a variable stepsize strategy. Note that incorporating an inertial term into the proximal gradient method is a simple and efficient acceleration technique, but the descent property of the proximal gradient algorithm is then lost. In our algorithm, the iterates generated by the inertial proximal gradient scheme are accepted when the objective function values decrease, or increase within an acceptable margin; otherwise, the iteration point is generated by the plain proximal gradient scheme, which ensures that the function values on a subset of the iterates are decreasing. We also introduce a variable stepsize strategy that requires neither a line search nor knowledge of the Lipschitz constant, making the algorithm easy to implement. We show that the sequence of iterates generated by the algorithm converges to a critical point of the objective function. Further, under the assumption that the objective function satisfies the Kurdyka-Lojasiewicz inequality, we prove convergence rates for both the objective function values and the iterates. Moreover, numerical results on both convex and nonconvex problems are reported to demonstrate the effectiveness and superiority of the proposed method and stepsize strategy.
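The acceptance rule described in the abstract (take the inertial step when the objective decreases or increases only moderately, otherwise fall back to a plain proximal gradient step) can be sketched as follows. This is an illustrative reconstruction in the general spirit of nonmonotone accelerated proximal gradient methods, not the authors' exact algorithm: it uses an averaged reference value for the nonmonotone test and a fixed stepsize 1/L in place of the paper's variable stepsize strategy, applied to the L1-regularized least-squares model problem. All function names and tolerances (`delta`, `eta`) are assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def nonmonotone_apg_lasso(A, b, lam, x0, step, delta=1e-4, eta=0.85, max_iter=500):
    """Illustrative nonmonotone accelerated proximal gradient loop for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1  (a sketch, not the paper's method)."""
    def F(x):
        return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
    def grad(x):
        return A.T @ (A @ x - b)

    x_prev = x0.copy()
    x = x0.copy()
    t_prev = t = 1.0
    c, q = F(x), 1.0          # averaged reference value for the acceptance test
    for _ in range(max_iter):
        # Inertial (extrapolated) point, Nesterov-style momentum coefficient
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        z = soft_threshold(y - step * grad(y), step * lam)
        if F(z) <= c - delta * np.sum((z - y) ** 2):
            x_next = z        # accept the inertial step (value may exceed F(x))
        else:
            # Safeguard: plain proximal gradient step, which is a descent step
            x_next = soft_threshold(x - step * grad(x), step * lam)
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        # Update the nonmonotone reference value (running weighted average)
        q_next = eta * q + 1.0
        c = (eta * q * c + F(x_next)) / q_next
        q = q_next
        x_prev, x = x, x_next
    return x
```

With `step = 1/L`, where `L` is the largest eigenvalue of `A.T @ A`, the fallback branch is a standard descent step, so the reference value `c` (and hence the accepted objective values) never rises above the starting value even though individual inertial steps may increase the objective.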
Pages: 863-897
Number of pages: 35
Related Papers
50 records in total
  • [31] Proximal Linearized Minimization Algorithm for Nonsmooth Nonconvex Minimization Problems in Image Deblurring with Impulse Noise
    Shirong DENG
    Yuchao TANG
Journal of Mathematical Research with Applications, 2024, 44 (01) : 122 - 142
  • [32] A note on the accelerated proximal gradient method for nonconvex optimization
    Wang, Huijuan
    Xu, Hong-Kun
    CARPATHIAN JOURNAL OF MATHEMATICS, 2018, 34 (03) : 449 - 457
  • [33] An approximation proximal gradient algorithm for nonconvex-linear minimax problems with nonconvex nonsmooth terms
    He, Jiefei
    Zhang, Huiling
    Xu, Zi
    JOURNAL OF GLOBAL OPTIMIZATION, 2024, 90 (01) : 73 - 92
  • [34] An inexact proximal gradient algorithm with extrapolation for a class of nonconvex nonsmooth optimization problems
    Jia, Zehui
    Wu, Zhongming
    Dong, Xiaomei
    JOURNAL OF INEQUALITIES AND APPLICATIONS, 2019, 2019 (1)
  • [35] An inexact proximal gradient algorithm with extrapolation for a class of nonconvex nonsmooth optimization problems
    Zehui Jia
    Zhongming Wu
    Xiaomei Dong
    Journal of Inequalities and Applications, 2019
  • [36] VARIABLE METRIC PROXIMAL STOCHASTIC VARIANCE REDUCED GRADIENT METHODS FOR NONCONVEX NONSMOOTH OPTIMIZATION
    Yu, Tengteng
    Liu, Xin-wei
    Dai, Yu-hong
    Sun, J. I. E.
    JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2022, 18 (04) : 2611 - 2631
  • [37] Asynchronous delay-aware accelerated proximal coordinate descent for nonconvex nonsmooth problems
    Kazemi, Ehsan
    Wang, Liqiang
    33rd AAAI Conference on Artificial Intelligence, AAAI 2019, 31st Innovative Applications of Artificial Intelligence Conference, IAAI 2019 and the 9th AAAI Symposium on Educational Advances in Artificial Intelligence, EAAI 2019, 2019, : 1528 - 1535
  • [38] Asynchronous Delay-Aware Accelerated Proximal Coordinate Descent for Nonconvex Nonsmooth Problems
    Kazemi, Ehsan
    Wang, Liqiang
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1528 - 1535
  • [39] Nonmonotone Proximal Gradient Method for Composite Multiobjective Optimization Problems
    Jian-Wen Peng
    Hua Sun
    Elisabeth Köbis
    Journal of Optimization Theory and Applications, 2025, 205 (3)
  • [40] A proximal bundle method for a class of nonconvex nonsmooth composite optimization problems
    Pang, Liping
    Wang, Xiaoliang
    Meng, Fanyun
    JOURNAL OF GLOBAL OPTIMIZATION, 2023, 86 (03) : 589 - 620