The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization

Cited: 0
Authors
Kovalev, Dmitry [1 ]
Gasnikov, Alexander [2 ,3 ,4 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[3] RAS, Res Ctr Trusted Artificial Intelligence, Inst Syst Programming, Moscow, Russia
[4] Natl Res Univ Higher Sch Econ, Moscow, Russia
Keywords
REGULARIZATION
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we revisit the smooth and strongly-convex-strongly-concave minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020) established the lower bound $\Omega\left(\sqrt{\kappa_x \kappa_y}\log\frac{1}{\epsilon}\right)$ on the number of gradient evaluations required to find an $\epsilon$-accurate solution, where $\kappa_x$ and $\kappa_y$ are condition numbers for the strong convexity and strong concavity assumptions. However, the existing state-of-the-art methods do not match this lower bound: algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation complexity $O\left(\sqrt{\kappa_x \kappa_y}\log^3\frac{1}{\epsilon}\right)$ and $O\left(\sqrt{\kappa_x \kappa_y}\log^3(\kappa_x \kappa_y)\log\frac{1}{\epsilon}\right)$, respectively. We fix this fundamental issue by providing the first algorithm with $O\left(\sqrt{\kappa_x \kappa_y}\log\frac{1}{\epsilon}\right)$ gradient evaluation complexity. We design our algorithm in three steps: (i) we reformulate the original problem as a minimization problem via the pointwise conjugate function; (ii) we apply a specific variant of the proximal point algorithm to the reformulated problem; (iii) we compute the proximal operator inexactly using the optimal algorithm for operator norm reduction in monotone inclusions.
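To make the complexity statements above easier to parse, here is a brief formal sketch of the standard smooth strongly-convex-strongly-concave setting; the symbols $f$, $L$, $\mu_x$, $\mu_y$, $d_x$, $d_y$ are assumed notation for illustration and are not copied from the paper itself.

\[
  \min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; f(x, y),
  \qquad \kappa_x = \frac{L}{\mu_x}, \quad \kappa_y = \frac{L}{\mu_y},
\]
where $f$ has $L$-Lipschitz gradients, is $\mu_x$-strongly convex in $x$, and is $\mu_y$-strongly concave in $y$. In this notation, the lower bound of Zhang et al. (2021) and Ibrahim et al. (2020) and the matching upper bound claimed in the abstract are
\[
  \Omega\!\left(\sqrt{\kappa_x \kappa_y}\,\log\frac{1}{\epsilon}\right)
  \quad\text{and}\quad
  O\!\left(\sqrt{\kappa_x \kappa_y}\,\log\frac{1}{\epsilon}\right)
\]
gradient evaluations to reach an $\epsilon$-accurate solution.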
Pages: 13