Inexact Reduced Gradient Methods in Nonconvex Optimization

Cited by: 3
Authors
Khanh, Pham Duy [1 ]
Mordukhovich, Boris S. [2 ]
Tran, Dat Ba [2 ]
Affiliations
[1] Ho Chi Minh City Univ Educ, Dept Math, Grp Anal & Appl Math, Ho Chi Minh City, Vietnam
[2] Wayne State Univ, Dept Math, Detroit, MI 48202 USA
Keywords
Nonconvex optimization; Inexact reduced gradient methods; Linesearch methods; Kurdyka-Lojasiewicz property; Convergence rates; Sampling algorithm; Descent methods; Convergence
DOI
10.1007/s10957-023-02319-9
Chinese Library Classification
C93 (Management); O22 (Operations Research)
Discipline Codes
070105; 12; 1201; 1202; 120202
Abstract
This paper proposes and develops new linesearch methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Abstract convergence results are first established for a broad class of linesearch methods. A general scheme of inexact reduced gradient (IRG) methods is then proposed, in which the errors in the gradient approximation adapt automatically to the magnitudes of the exact gradients. The iteration sequences are shown to have stationary accumulation points under several stepsize selections. Convergence results with constructive convergence rates for the developed IRG methods are established under the Kurdyka-Lojasiewicz property. These results are confirmed by encouraging numerical experiments, which demonstrate the advantages of automatically controlled errors in IRG methods over other frequently used error selections.
Pages: 2138-2178
Page count: 41
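
The abstract's central mechanism, gradient errors that adapt automatically to the magnitude of the gradient inside a linesearch scheme, can be illustrated in code. The Python sketch below is not the paper's algorithm: it assumes a relative-error oracle satisfying ||g_k - grad f(x_k)|| <= mu * ||grad f(x_k)|| together with a standard Armijo backtracking linesearch, and the function names, parameters, and the Rosenbrock test problem are illustrative choices only.

    import numpy as np

    def irg_sketch(f, grad, x0, mu=0.1, sigma=1e-4, beta=0.5,
                   t0=1.0, tol=1e-8, max_iter=5000, rng=None):
        """Minimal sketch of an inexact-gradient linesearch method.

        The exact gradient is perturbed by noise of norm mu*||grad(x)||,
        so the relative-error bound ||g - grad(x)|| <= mu*||grad(x)||
        holds by construction (a stand-in for the paper's automatic
        error control). A backtracking Armijo linesearch is then run
        along the inexact direction -g.
        """
        rng = np.random.default_rng(0) if rng is None else rng
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            exact = grad(x)
            noise = rng.standard_normal(x.shape)
            nrm = np.linalg.norm(noise)
            if nrm > 0:
                # Rescale the noise so the relative-error bound holds.
                noise *= mu * np.linalg.norm(exact) / nrm
            g = exact + noise
            if np.linalg.norm(g) <= tol:
                break
            # Armijo backtracking: shrink t until sufficient decrease
            # holds, with the inexact g used in place of the true gradient.
            t, fx = t0, f(x)
            while t > 1e-16 and f(x - t * g) > fx - sigma * t * np.dot(g, g):
                t *= beta
            x = x - t * g
        return x

    # Illustrative run on the Rosenbrock function.
    f = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])
    x_star = irg_sketch(f, grad, np.array([-1.2, 1.0]), mu=0.05)

The relative bound keeps -g a descent direction whenever mu < 1, which is what makes the backtracking step well defined; the paper develops this kind of error control rigorously for several stepsize selections.
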
Related Papers
50 in total
  • [41] An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization
    Liu, Ruyu
    Pan, Shaohua
    Wu, Yuqia
    Yang, Xiaoqi
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 88 : 603 - 641
  • [42] Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
    Wu, Zhongming
    Li, Chongshou
    Li, Min
    Lim, Andrew
    JOURNAL OF GLOBAL OPTIMIZATION, 2021, 79 (03) : 617 - 644
  • [44] Faster Gradient-Free Proximal Stochastic Methods for Nonconvex Nonsmooth Optimization
    Huang, Feihu
    Gu, Bin
    Huo, Zhouyuan
    Chen, Songcan
    Huang, Heng
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1503 - 1510
  • [45] Nonconvex optimization: Gradient flows and deformation
    Jongen, H. Th.
    Stein, O.
    JOURNAL OF DYNAMICAL AND CONTROL SYSTEMS, 2001, 7 (03) : 425 - 446
  • [46] Variance-reduced reshuffling gradient descent for nonconvex optimization: Centralized and distributed algorithms
    Jiang, Xia
    Zeng, Xianlin
    Xie, Lihua
    Sun, Jian
    Chen, Jie
    AUTOMATICA, 2025, 171
  • [47] Inexact primal-dual gradient projection methods for nonlinear optimization on convex set
    Zhang, Fan
    Wang, Hao
    Wang, Jiashan
    Yang, Kai
    OPTIMIZATION, 2020, 69 (10) : 2339 - 2365
  • [48] Adaptive Methods for Nonconvex Optimization
    Zaheer, Manzil
    Reddi, Sashank J.
    Sachan, Devendra
    Kale, Satyen
    Kumar, Sanjiv
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [49] Inexact Proximal Gradient Methods for Non-Convex and Non-Smooth Optimization
    Gu, Bin
    Wang, De
    Huo, Zhouyuan
    Huang, Heng
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3093 - 3100
  • [50] Accelerated methods for nonconvex optimization
    Carmon, Yair
    Duchi, John C.
    Hinder, Oliver
    Sidford, Aaron
    SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (02) : 1751 - 1772