Adaptive First-Order Methods Revisited: Convex Optimization without Lipschitz Requirements

Cited: 0
Authors
Antonakopoulos, Kimon [1]
Mertikopoulos, Panayotis
Affiliations
[1] Univ Grenoble Alpes, CNRS, INRIA, Grenoble INP, LIG, F-38000 Grenoble, France
Keywords
Gradient continuity; Subgradient methods; Descent
DOI
Not available
CLC Classification
TP18 (Artificial intelligence theory)
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose a new family of adaptive first-order methods for a class of convex minimization problems that may fail to be Lipschitz continuous or smooth. Specifically, motivated by a recent flurry of activity on non-Lipschitz (NoLips) optimization, we consider problems that are continuous or smooth relative to a reference Bregman function - as opposed to a global, ambient norm (Euclidean or otherwise). These conditions encompass a wide range of problems with singular objectives that cannot be treated with standard first-order methods for Lipschitz continuous/smooth problems - such as Fisher markets, Poisson tomography problems, D-optimal design, and the like. In this setting, the application of existing order-optimal adaptive methods - like UNIXGRAD or ACCELEGRAD - is not possible, especially in the presence of randomness and uncertainty. The proposed method - which we call adaptive mirror descent (ADAMIR) - aims to close this gap by concurrently achieving min-max optimal rates in problems that are relatively continuous or smooth, including stochastic ones.
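To make the setting concrete, the sketch below shows generic entropic mirror descent with an AdaGrad-style adaptive step size on the probability simplex. This is a minimal stand-in for the ADAMIR template described in the abstract, not the paper's actual update rule: the objective, the step-size policy, and all names here are illustrative assumptions.

```python
import numpy as np

def entropic_mirror_descent(grad, x0, n_iters=500):
    """Mirror descent with the entropic (KL) Bregman setup on the simplex.

    The step size adapts to the observed gradients (AdaGrad-style), so no
    Lipschitz constant is assumed; this mirrors the spirit, not the letter,
    of the ADAMIR method in the abstract.
    """
    x = x0.copy()
    g2_sum = 0.0  # running sum of squared gradient norms
    for _ in range(n_iters):
        g = grad(x)
        g2_sum += np.dot(g, g)
        eta = 1.0 / np.sqrt(1.0 + g2_sum)  # adaptive step, no tuning to L
        x = x * np.exp(-eta * g)           # entropic mirror (multiplicative) step
        x /= x.sum()                       # Bregman projection onto the simplex
    return x

# Toy problem (illustrative only): minimize f(x) = 0.5 * ||A x - b||^2
# over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_star = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
b = A @ x_star
x = entropic_mirror_descent(lambda z: A.T @ (A @ z - b), np.full(5, 0.2))
```

The multiplicative update keeps iterates strictly positive, which is why entropic mirror descent is a natural fit for the singular objectives (Fisher markets, Poisson tomography, D-optimal design) mentioned above, whose gradients blow up at the boundary.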
Pages: 13