Adaptive Quasi-Newton and Anderson Acceleration Framework with Explicit Global (Accelerated) Convergence Rates

Cited by: 0
Author
Scieur, Damien [1]
Affiliation
[1] Samsung SAIL Montreal, Montreal, PQ, Canada
Keywords
VECTOR EXTRAPOLATION METHODS; SUPERLINEAR CONVERGENCE; CUBIC REGULARIZATION; EPSILON-ALGORITHM; SYSTEMS; BFGS; EQUATIONS; BAD
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Despite the impressive numerical performance of quasi-Newton and Anderson/nonlinear acceleration methods, their global convergence rates have remained elusive for over 50 years. This study addresses this long-standing issue by introducing a framework that derives novel, adaptive quasi-Newton and nonlinear/Anderson acceleration schemes. Under mild assumptions, the proposed iterative methods exhibit explicit, non-asymptotic convergence rates that blend those of gradient descent and the Cubic Regularized Newton method. The proposed approach also includes an accelerated version for convex functions. Notably, these rates are achieved adaptively, without prior knowledge of the function's parameters. The framework is generic, and its special cases include algorithms such as Newton's method with random subspaces, finite differences, or a lazy Hessian. Numerical experiments demonstrate the efficiency of the proposed framework, even compared with the L-BFGS algorithm with Wolfe line search. The code used in the experiments is available at https://github.com/windows7lover/QN_With_Guarantees.
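To make the Anderson acceleration ingredient of the abstract concrete, the sketch below implements a generic windowed (Type-II) Anderson acceleration loop for a fixed-point problem x = g(x). This is a textbook scheme, not the adaptive method proposed in the paper, and the function name anderson_acceleration, the window size m, and the least-squares mixing step are illustrative assumptions rather than the repository's API.

import numpy as np

def anderson_acceleration(g, x0, m=5, max_iter=100, tol=1e-10):
    # Generic windowed (Type-II) Anderson acceleration for x = g(x).
    # Textbook sketch only; NOT the adaptive scheme proposed in the paper.
    x = np.asarray(x0, dtype=float)
    X, G = [], []                           # histories of iterates and g-values
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx - x) < tol:    # fixed-point residual small enough
            return gx
        X.append(x); G.append(gx)
        X, G = X[-(m + 1):], G[-(m + 1):]   # keep only the last m+1 pairs
        if len(X) == 1:
            x = gx                          # plain fixed-point step to start
            continue
        F = np.column_stack([gi - xi for gi, xi in zip(G, X)])  # residuals f_i
        dF = F[:, 1:] - F[:, :-1]           # residual differences
        # Standard difference form: solve min_gamma || f_k - dF @ gamma ||.
        gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
        Gm = np.column_stack(G)
        x = G[-1] - (Gm[:, 1:] - Gm[:, :-1]) @ gamma  # extrapolated iterate
    return x

As a usage example, anderson_acceleration(np.cos, np.array([1.0])) converges to the fixed point of cosine far faster than the plain iteration x_{k+1} = cos(x_k); a gradient-descent map g(x) = x - h*grad_f(x) fits the same template, which is how such schemes accelerate smooth optimization.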
Pages: 58
Related papers
50 records in total
  • [31] ACCELERATION OF ADAPTIVE NORMALIZED QUASI-NEWTON ALGORITHM WITH IMPROVED UPPER BOUNDS OF THE CONDITION NUMBER
    Kakimoto, Kenji
    Yamagishi, Masao
    Yamada, Isao
2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017 : 4267 - 4271
  • [32] aSNAQ: An adaptive stochastic Nesterov's accelerated quasi-Newton method for training RNNs
    Sendilkkumaar, Indrapriyadarsini
    Mahboubi, Shahrzad
    Ninomiya, Hiroshi
    Asai, Hideki
    IEICE NONLINEAR THEORY AND ITS APPLICATIONS, 2020, 11 (04): : 409 - 421
  • [33] Momentum Acceleration of Quasi-Newton Training for Neural Networks
    Mahboubi, Shahrzad
    Indrapriyadarsini, S.
    Ninomiya, Hiroshi
    Asai, Hideki
    PRICAI 2019: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2019, 11671 : 268 - 281
  • [34] Analysis of a quasi-Newton adaptive filtering algorithm
    deCampos, MLR
    Antoniou, A
ICECS 96 - PROCEEDINGS OF THE THIRD IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS, AND SYSTEMS, VOLS 1 AND 2, 1996 : 848 - 851
  • [35] A FAST QUASI-NEWTON ADAPTIVE FILTERING ALGORITHM
    MARSHALL, DF
    JENKINS, WK
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1992, 40 (07) : 1652 - 1662
  • [36] Robust Quasi-Newton Adaptive Filtering Algorithms
    Bhotto, Md. Zulfiquar Ali
    Antoniou, Andreas
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2011, 58 (08) : 537 - 541
  • [37] Adaptive Quasi-Newton Algorithm for Remote Sensing
    Wang, Jun
    Yu, Wenbo
    Huang, He
2023 IEEE 12TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE, DDCLS, 2023 : 1233 - 1238
  • [38] On the global convergence of an inexact quasi-Newton conditional gradient method for constrained nonlinear systems
    Goncalves, M. L. N.
    Oliveira, F. R.
    NUMERICAL ALGORITHMS, 2020, 84 (02) : 609 - 631
  • [39] On superlinear convergence of quasi-Newton methods for nonsmooth equations
    Qi, LQ
    OPERATIONS RESEARCH LETTERS, 1997, 20 (05) : 223 - 228
  • [40] Exploiting Local Convergence of Quasi-Newton Methods Globally: Adaptive Sample Size Approach
    Jin, Qiujiang
    Mokhtari, Aryan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34