Inexact Reduced Gradient Methods in Nonconvex Optimization

Cited by: 3
Authors
Khanh, Pham Duy [1 ]
Mordukhovich, Boris S. [2 ]
Tran, Dat Ba [2 ]
Institutions
[1] Ho Chi Minh City Univ Educ, Dept Math, Grp Anal & Appl Math, Ho Chi Minh City, Vietnam
[2] Wayne State Univ, Dept Math, Detroit, MI 48202 USA
Keywords
Nonconvex optimization; Inexact reduced gradient methods; Linesearch methods; Kurdyka-Lojasiewicz property; Convergence rates; SAMPLING ALGORITHM; DESCENT METHODS; CONVERGENCE;
DOI
10.1007/s10957-023-02319-9
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Discipline Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
This paper proposes and develops new linesearch methods with inexact gradient information for finding stationary points of nonconvex continuously differentiable functions on finite-dimensional spaces. Abstract convergence results are established for a broad class of linesearch methods. A general scheme for inexact reduced gradient (IRG) methods is proposed, in which the errors in the gradient approximation adapt automatically to the magnitudes of the exact gradients. The sequences of iterates are shown to have stationary accumulation points under several stepsize selections. Convergence results with constructive convergence rates for the developed IRG methods are established under the Kurdyka-Lojasiewicz property. These results are confirmed by encouraging numerical experiments, which demonstrate the advantages of automatically controlled errors in IRG methods over other frequently used error selections.
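The abstract's central idea, a gradient method whose allowed gradient error shrinks in proportion to the gradient itself, combined with a backtracking linesearch, can be sketched as follows. This is an illustrative assumption-laden sketch, not the paper's actual IRG scheme: the relative-error model `||e|| <= mu * ||grad f(x)||`, the Armijo parameters, and all function names below are chosen for the example only.

```python
import numpy as np

def irg_descent(f, grad, x0, mu=0.3, c=1e-4, beta=0.5, tol=1e-6, max_iter=2000, seed=0):
    """Sketch of an inexact gradient method with Armijo backtracking.

    The simulated oracle returns grad f(x) corrupted by a perturbation of norm
    at most mu * ||grad f(x)||, so the permitted error adapts to the magnitude
    of the exact gradient (the "automatically controlled error" idea).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g_true = grad(x)
        e = rng.standard_normal(x.shape)
        if np.linalg.norm(e) > 0:
            e *= mu * np.linalg.norm(g_true) / np.linalg.norm(e)
        g = g_true + e                       # inexact gradient, relative error <= mu
        if np.linalg.norm(g) <= tol:         # small inexact gradient forces small true one
            break
        t = 1.0
        # Armijo backtracking: since mu < 1, -g is still a descent direction for f
        while f(x - t * g) > f(x) - c * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Toy smooth test problem: f(x) = 0.5 * x'Ax with A = diag(1, 10)
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
x_star = irg_descent(f, grad, [1.0, 1.0])
```

With `mu < 1`, the inner product of the exact gradient with the inexact direction satisfies `<grad f(x), g> >= (1 - mu) * ||grad f(x)||^2`, so the backtracking loop terminates and each step decreases `f`; this is the mechanism that lets relative (magnitude-adaptive) errors preserve convergence where fixed absolute errors would not.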
Pages: 2138 - 2178
Page count: 41
Related Papers
50 records
  • [31] pbSGD: Powered Stochastic Gradient Descent Methods for Accelerated Nonconvex Optimization
    Zhou, Beitong
    Liu, Jun
    Sun, Weigao
    Chen, Ruijuan
    Tomlin, Claire
    Yuan, Ye
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3258 - 3266
  • [32] New proximal bundle algorithm based on the gradient sampling method for nonsmooth nonconvex optimization with exact and inexact information
    Monjezi, N. Hoseini
    Nobakhtian, S.
    NUMERICAL ALGORITHMS, 2023, 94 (02) : 765 - 787
  • [33] Gradient Methods With Dynamic Inexact Oracles
    Han, Shuo
    IEEE CONTROL SYSTEMS LETTERS, 2021, 5 (04): : 1163 - 1168
  • [34] SPECTRAL PROJECTED GRADIENT METHOD WITH INEXACT RESTORATION FOR MINIMIZATION WITH NONCONVEX CONSTRAINTS
    Gomes-Ruggiero, M. A.
    Martinez, J. M.
    Santos, S. A.
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2009, 31 (03): : 1628 - 1652
  • [36] Gradient Methods with Dynamic Inexact Oracles
    Han, Shuo
    2021 AMERICAN CONTROL CONFERENCE (ACC), 2021, : 941 - 946
  • [37] An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization
    Liu, Ruyu
    Pan, Shaohua
    Wu, Yuqia
    Yang, Xiaoqi
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 88 (02) : 603 - 641
  • [38] An Inexact Augmented Lagrangian Framework for Nonconvex Optimization with Nonlinear Constraints
    Sahin, Mehmet Fatih
    Eftekhari, Armin
    Alacaoglu, Ahmet
    Latorre, Fabian
    Cevher, Volkan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [39] Inexact Block Coordinate Descent Algorithms for Nonsmooth Nonconvex Optimization
    Yang, Yang
    Pesavento, Marius
    Luo, Zhi-Quan
    Ottersten, Bjorn
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2020, 68 : 947 - 961
  • [40] Image reconstruction by nonconvex inexact half-quadratic optimization
    Robini, Marc
    Niu, Pei
    Yang, Feng
    Zhu, Yuemin
    2017 IEEE NUCLEAR SCIENCE SYMPOSIUM AND MEDICAL IMAGING CONFERENCE (NSS/MIC), 2017,