Adaptive Quasi-Newton and Anderson Acceleration Framework with Explicit Global (Accelerated) Convergence Rates

Cited by: 0
Authors
Scieur, Damien [1 ]
Affiliations
[1] Samsung SAIL Montreal, Montreal, PQ, Canada
Keywords
VECTOR EXTRAPOLATION METHODS; SUPERLINEAR CONVERGENCE; CUBIC REGULARIZATION; EPSILON-ALGORITHM; SYSTEMS; BFGS; EQUATIONS; BAD;
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104 ; 0812 ; 0835 ; 1405
Abstract
Despite the impressive numerical performance of quasi-Newton and Anderson/nonlinear acceleration methods, their global convergence rates have remained elusive for over 50 years. This study addresses this long-standing issue by introducing a framework that derives novel, adaptive quasi-Newton and nonlinear/Anderson acceleration schemes. Under mild assumptions, the proposed iterative methods exhibit explicit, non-asymptotic convergence rates that blend those of gradient descent and the Cubic Regularized Newton method. The proposed approach also includes an accelerated version for convex functions. Notably, these rates are achieved adaptively, without prior knowledge of the function's parameters. The framework presented in this study is generic, and its special cases include algorithms such as Newton's method with random subspaces, finite differences, or lazy Hessians. Numerical experiments demonstrate the efficiency of the proposed framework, even compared to the L-BFGS algorithm with Wolfe line search. The code used in the experiments is available at https://github.com/windows7lover/QN_With_Guarantees.
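The abstract builds on classical Anderson acceleration for fixed-point iterations. For readers unfamiliar with it, the sketch below shows the standard (Type-II) textbook method with a finite memory window; it is NOT the paper's adaptive scheme, and all names (`anderson_acceleration`, `g`, `m`) are illustrative choices, not identifiers from the paper's code.

```python
import numpy as np

def anderson_acceleration(g, x0, m=5, tol=1e-10, max_iter=100):
    """Classical (Type-II) Anderson acceleration for x = g(x).

    Generic textbook sketch, not the adaptive method of the paper above.
    """
    x = np.asarray(x0, dtype=float)
    gx = g(x)
    X, Gx = [x], [gx]  # histories of iterates and their images
    for _ in range(max_iter):
        k = len(X)
        if k == 1:
            x_new = gx  # plain fixed-point step
        else:
            # Residuals f_i = g(x_i) - x_i for the stored iterates
            F = np.array([Gx[i] - X[i] for i in range(k)]).T
            # Minimize ||sum_i alpha_i f_i|| s.t. sum_i alpha_i = 1,
            # via the unconstrained least-squares on residual differences
            dF = F[:, 1:] - F[:, :-1]
            gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
            alpha = np.zeros(k)
            alpha[-1] = 1.0
            alpha[1:] -= gamma
            alpha[:-1] += gamma
            x_new = np.array(Gx).T @ alpha  # mix the images g(x_i)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
        gx = g(x)
        X.append(x)
        Gx.append(gx)
        if len(X) > m + 1:  # keep a sliding window of m+1 points
            X.pop(0)
            Gx.pop(0)
    return x

# Contractive affine map: the fixed point solves (I - A) x* = b
A = np.array([[0.5, 0.2], [0.1, 0.4]])
b = np.array([1.0, 2.0])
g = lambda x: A @ x + b
x_star = anderson_acceleration(g, np.zeros(2))
```

On affine maps like this one, Anderson acceleration is known to be closely related to GMRES, which is why it converges in far fewer iterations than the plain fixed-point scheme; the paper's contribution is to equip this family of methods with explicit global rate guarantees.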
Pages: 58
Related Papers (50 records)
  • [1] Explicit Convergence Rates of Greedy and Random Quasi-Newton Methods
    Lin, Dachao
    Ye, Haishan
    Zhang, Zhihua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [3] GREEDY QUASI-NEWTON METHODS WITH EXPLICIT SUPERLINEAR CONVERGENCE
    Rodomanov, Anton
    Nesterov, Yurii
    SIAM JOURNAL ON OPTIMIZATION, 2021, 31 (01) : 785 - 811
  • [4] Adaptive Greedy Quasi-Newton with Superlinear Rate and Global Convergence Guarantee
    Du, Yubo
    You, Keyou
    2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 7606 - 7611
  • [5] QUASI-NEWTON METHODS FOR CONVERGENCE ACCELERATION OF CYCLIC SYSTEMS
    SOLIMAN, MA
    CANADIAN JOURNAL OF CHEMICAL ENGINEERING, 1979, 57 (05) : 643 - 647
  • [6] Rates of superlinear convergence for classical quasi-Newton methods
    Rodomanov, Anton
    Nesterov, Yurii
    MATHEMATICAL PROGRAMMING, 2022, 194 (1-2) : 159 - 190
  • [8] Global convergence of quasi-Newton methods for unconstrained optimization
    Han, LX
    Liu, GH
    CHINESE SCIENCE BULLETIN, 1996, 41 (07) : 529 - 533
  • [10] Distributed adaptive greedy quasi-Newton methods with explicit non-asymptotic convergence bounds
    Du, Yubo
    You, Keyou
    AUTOMATICA, 2024, 165