An Optimal Algorithm for Bandit Convex Optimization with Strongly-Convex and Smooth Loss

Cited: 0
Authors
Ito, Shinji [1]
Affiliations
[1] Univ Tokyo, NEC Corp, Tokyo, Japan
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We consider non-stochastic bandit convex optimization with strongly-convex and smooth loss functions. For this problem, Hazan and Levy proposed an algorithm with a regret bound of Õ(d^{3/2}√T) given access to an O(d)-self-concordant barrier over the feasible region, where d and T stand for the dimensionality of the feasible region and the number of rounds, respectively. However, there is no known efficient way to construct self-concordant barriers for general convex sets, and a gap of Õ(√d) has remained between the upper and lower bounds, as the known regret lower bound is Ω(d√T). Our study resolves both issues by introducing an algorithm that achieves an optimal regret bound of Õ(d√T) under a mild assumption, without self-concordant barriers. More precisely, the algorithm requires only a membership oracle for the feasible region, and it achieves the optimal Õ(d√T) regret bound under the assumption that the optimal solution lies in the interior of the feasible region. Even without this assumption, our algorithm achieves Õ(d^{3/2}√T) regret.
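The paper's own algorithm is not reproduced in this record, but the bandit feedback model it addresses can be illustrated with the classic one-point gradient-estimation scheme (in the style of Flaxman, Kalai, and McMahan) that much of bandit convex optimization builds on. The loss function, step sizes, exploration radius, and unit-ball feasible region below are illustrative assumptions for a minimal sketch, not the authors' construction:

```python
import numpy as np

def one_point_bandit_gd(loss, d, T, rng=None):
    """Sketch of one-point bandit gradient descent on the unit ball.

    Each round queries the loss at a single perturbed point x + delta*u;
    g = (d/delta) * loss(x + delta*u) * u is an unbiased estimate of the
    gradient of a smoothed version of the loss.
    """
    rng = np.random.default_rng(rng)
    x = np.zeros(d)
    total_loss = 0.0
    for t in range(1, T + 1):
        delta = t ** -0.25              # shrinking exploration radius
        eta = 1.0 / t                   # 1/t steps suit strong convexity
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)          # uniform direction on the sphere
        y = x + delta * u               # the single query point this round
        f = loss(y)                     # only bandit (zeroth-order) feedback
        total_loss += f
        g = (d / delta) * f * u         # one-point gradient estimate
        x = x - eta * g
        n = np.linalg.norm(x)
        if n > 1.0 - delta:             # keep future queries feasible
            x *= (1.0 - delta) / n
    return x, total_loss
```

The projection step shrinks the iterate into a ball of radius 1 − delta so that the perturbed query point always stays inside the feasible region; the paper's contribution is, in part, removing the need for a self-concordant barrier in this kind of construction.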
Pages: 2229-2238 (10 pages)