Universal Gradient Descent Ascent Method for Nonconvex-Nonconcave Minimax Optimization

Cited by: 0
Authors
Zheng, Taoli [1 ]
Zhu, Linglingzhi [1 ]
So, Anthony Man-Cho [1 ]
Blanchet, José [2 ]
Li, Jiajin [2 ]
Affiliations
[1] CUHK, Hong Kong
[2] Stanford University, United States
Keywords: Compendex
DOI: not available
Pages: 54075-54110
Related Papers (50 in total)
  • [41] Dissipative Gradient Descent Ascent Method: A Control Theory Inspired Algorithm for Min-Max Optimization
    Zheng, Tianqi
    Loizou, Nicolas
    You, Pengcheng
    Mallada, Enrique
    IEEE CONTROL SYSTEMS LETTERS, 2024, 8 : 2009 - 2014
  • [42] A Communication-Efficient Stochastic Gradient Descent Algorithm for Distributed Nonconvex Optimization
    Xie, Antai
    Yi, Xinlei
    Wang, Xiaofan
    Cao, Ming
    Ren, Xiaoqiang
    2024 IEEE 18TH INTERNATIONAL CONFERENCE ON CONTROL & AUTOMATION, ICCA 2024, 2024, : 609 - 614
  • [43] STOCHASTIC ALTERNATING STRUCTURE-ADAPTED PROXIMAL GRADIENT DESCENT METHOD WITH VARIANCE REDUCTION FOR NONCONVEX NONSMOOTH OPTIMIZATION
    Jia, Zehui
    Zhang, Wenxing
    Cai, Xingju
    Han, Deren
    MATHEMATICS OF COMPUTATION, 2024, 93 (348) : 1677 - 1714
  • [44] A Descent Conjugate Gradient Method for Optimization Problems
    Semiu, Ayinde
    Idowu, Osinuga
    Adesina, Adio
    Sunday, Agboola
    Joseph, Adelodun
    Uchenna, Uka
    Olufisayo, Awe
    IAENG International Journal of Applied Mathematics, 2024, 54 (09) : 1765 - 1775
  • [45] Fast Optimistic Gradient Descent Ascent (OGDA) Method in Continuous and Discrete Time
    Bot, Radu Ioan
    Csetnek, Ernoe Robert
    Nguyen, Dang-Khoa
    FOUNDATIONS OF COMPUTATIONAL MATHEMATICS, 2025, 25 (01) : 163 - 222
  • [46] A note on the accelerated proximal gradient method for nonconvex optimization
    Wang, Huijuan
    Xu, Hong-Kun
    CARPATHIAN JOURNAL OF MATHEMATICS, 2018, 34 (03) : 449 - 457
  • [47] Solving Min-Max Optimization with Hidden Structure via Gradient Descent Ascent
    Flokas, Lampros
    Vlatakis-Gkaragkounis, Emmanouil V.
    Piliouras, Georgios
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [48] Optimal Epoch Stochastic Gradient Descent Ascent Methods for Min-Max Optimization
    Yan, Yan
    Xu, Yi
    Lin, Qihang
    Liu, Wei
    Yang, Tianbao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [49] Analytical convergence regions of accelerated gradient descent in nonconvex optimization under Regularity Condition
    Xiong, Huaqing
    Chi, Yuejie
    Hu, Bin
    Zhang, Wei
    AUTOMATICA, 2020, 113 (113)
  • [50] Variance-reduced reshuffling gradient descent for nonconvex optimization: Centralized and distributed algorithms
    Jiang, Xia
    Zeng, Xianlin
    Xie, Lihua
    Sun, Jian
    Chen, Jie
    AUTOMATICA, 2025, 171