The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization

Cited by: 0
Authors
Kovalev, Dmitry [1 ]
Gasnikov, Alexander [2 ,3 ,4 ]
Affiliations
[1] King Abdullah Univ Sci & Technol, Thuwal, Saudi Arabia
[2] Moscow Inst Phys & Technol, Dolgoprudnyi, Russia
[3] RAS, Res Ctr Trusted Artificial Intelligence, Inst Syst Programming, Moscow, Russia
[4] Natl Res Univ Higher Sch Econ, Moscow, Russia
Keywords
REGULARIZATION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we revisit the smooth and strongly-convex-strongly-concave minimax optimization problem. Zhang et al. (2021) and Ibrahim et al. (2020) established the lower bound $\Omega\bigl(\sqrt{\kappa_x \kappa_y}\,\log\frac{1}{\epsilon}\bigr)$ on the number of gradient evaluations required to find an $\epsilon$-accurate solution, where $\kappa_x$ and $\kappa_y$ are condition numbers for the strong convexity and strong concavity assumptions. However, the existing state-of-the-art methods do not match this lower bound: algorithms of Lin et al. (2020) and Wang and Li (2020) have gradient evaluation complexity $\mathcal{O}\bigl(\sqrt{\kappa_x \kappa_y}\,\log^{3}\frac{1}{\epsilon}\bigr)$ and $\mathcal{O}\bigl(\sqrt{\kappa_x \kappa_y}\,\log^{3}(\kappa_x \kappa_y)\,\log\frac{1}{\epsilon}\bigr)$, respectively. We fix this fundamental issue by providing the first algorithm with $\mathcal{O}\bigl(\sqrt{\kappa_x \kappa_y}\,\log\frac{1}{\epsilon}\bigr)$ gradient evaluation complexity. We design our algorithm in three steps: (i) we reformulate the original problem as a minimization problem via the pointwise conjugate function; (ii) we apply a specific variant of the proximal point algorithm to the reformulated problem; (iii) we compute the proximal operator inexactly using the optimal algorithm for operator norm reduction in monotone inclusions.
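As a reading aid, here is a minimal sketch of the setting in standard notation; the smoothness constant $L$ and the definitions of the condition numbers below are assumed conventions, not spelled out in this record:
\[
\min_{x \in \mathbb{R}^{d_x}} \; \max_{y \in \mathbb{R}^{d_y}} \; f(x, y),
\qquad \kappa_x = \frac{L}{\mu_x}, \qquad \kappa_y = \frac{L}{\mu_y},
\]
where $f$ is assumed $L$-smooth, $\mu_x$-strongly convex in $x$, and $\mu_y$-strongly concave in $y$. Under these conventions, the complexities quoted in the abstract compare as
\[
\underbrace{\Omega\Bigl(\sqrt{\kappa_x \kappa_y}\,\log\tfrac{1}{\epsilon}\Bigr)}_{\text{lower bound}}
\quad\text{vs.}\quad
\underbrace{\mathcal{O}\Bigl(\sqrt{\kappa_x \kappa_y}\,\log^{3}\tfrac{1}{\epsilon}\Bigr),\;
\mathcal{O}\Bigl(\sqrt{\kappa_x \kappa_y}\,\log^{3}(\kappa_x \kappa_y)\,\log\tfrac{1}{\epsilon}\Bigr)}_{\text{prior upper bounds}},
\]
so the $\mathcal{O}\bigl(\sqrt{\kappa_x \kappa_y}\,\log\frac{1}{\epsilon}\bigr)$ rate claimed by the authors matches the lower bound, removing the extra polylogarithmic factors of the prior methods.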
Pages: 13