An Optimal Algorithm for Bandit Convex Optimization with Strongly-Convex and Smooth Loss

Cited by: 0
Author
Ito, Shinji [1 ]
Affiliation
[1] Univ Tokyo, NEC Corp, Tokyo, Japan
Keywords
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We consider non-stochastic bandit convex optimization with strongly-convex and smooth loss functions. For this problem, Hazan and Levy proposed an algorithm with a regret bound of Õ(d^{3/2}√T), given access to an O(d)-self-concordant barrier over the feasible region, where d and T stand for the dimensionality of the feasible region and the number of rounds, respectively. However, there is no known efficient way to construct self-concordant barriers for general convex sets, and an Õ(√d) gap has remained between the upper and lower bounds, as the known regret lower bound is Ω(d√T). Our study resolves these two issues by introducing an algorithm that achieves an optimal regret bound of Õ(d√T) under a mild assumption, without self-concordant barriers. More precisely, the algorithm requires only a membership oracle for the feasible region, and it achieves the optimal Õ(d√T) regret bound under the assumption that the optimal solution lies in the interior of the feasible region. Even without this assumption, our algorithm achieves Õ(d^{3/2}√T) regret.
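The abstract does not spell out Ito's algorithm itself. As background for the bandit feedback model it describes (the learner observes only the loss value at the played point, never a gradient), here is a minimal sketch of the classic one-point spherical gradient estimator in the style of Flaxman, Kalai, and McMahan. The function name, the step sizes, and the unit-ball feasible region are illustrative assumptions, not the paper's method.

```python
import numpy as np


def bandit_gradient_descent(loss, d, T, eta=0.01, delta=0.1, radius=1.0):
    """One-point bandit gradient estimation on a Euclidean ball (a sketch).

    Each round queries the loss at a single perturbed point x + delta*u and
    uses (d/delta) * loss(y) * u, an unbiased estimate of the gradient of a
    delta-smoothed version of the loss, as the descent direction.
    """
    rng = np.random.default_rng(0)
    x = np.zeros(d)                          # start at the center
    total_loss = 0.0
    for _ in range(T):
        u = rng.normal(size=d)
        u /= np.linalg.norm(u)               # uniform direction on the unit sphere
        y = x + delta * u                    # the point actually played
        f = loss(y)                          # single bandit (zeroth-order) query
        total_loss += f
        g_hat = (d / delta) * f * u          # one-point gradient estimate
        x = x - eta * g_hat
        nrm = np.linalg.norm(x)              # project so x + delta*u stays feasible
        if nrm > radius - delta:
            x *= (radius - delta) / nrm
    return x, total_loss / T
```

For strongly-convex and smooth losses, more refined estimators and step-size schedules (as in Hazan-Levy and in this paper) sharpen the regret from the Õ(T^{3/4}) of this basic scheme down to Õ(d√T); the sketch only illustrates the feedback model, not those rates.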
Pages: 2229-2238
Page count: 10
Related Papers (50 in total)
  • [31] Yuan, Deming; Ho, Daniel W. C.; Hong, Yiguang; Jiang, Guoping. Online Bandit Convex Optimization Over a Network. Proceedings of the 35th Chinese Control Conference, 2016: 8090-8095.
  • [32] Luo, Haipeng; Zhang, Mengxiao; Zhao, Peng. Adaptive Bandit Convex Optimization with Heterogeneous Curvature. Conference on Learning Theory, Vol 178, 2022.
  • [33] Chen, Lin; Zhang, Mingrui; Karbasi, Amin. Projection-Free Bandit Convex Optimization. 22nd International Conference on Artificial Intelligence and Statistics, Vol 89, 2019.
  • [34] Hazan, Elad; Levy, Kfir Y. Bandit Convex Optimization: Towards Tight Bounds. Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014.
  • [35] Chang, Ting-Jui; Shahrampour, Shahin. On Online Optimization: Dynamic Regret Analysis of Strongly Convex and Smooth Problems. Thirty-Fifth AAAI Conference on Artificial Intelligence, 2021, 35: 6966-6973.
  • [36] Lamperski, Jourdain; Prokopyev, Oleg A.; Wrabetz, Luca G. Min-Max-Min Optimization with Smooth and Strongly Convex Objectives. SIAM Journal on Optimization, 2023, 33(3): 2435-2456.
  • [37] Hendrikx, Hadrien; Bach, Francis; Massoulie, Laurent. Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives. 22nd International Conference on Artificial Intelligence and Statistics, Vol 89, 2019: 897-906.
  • [38] Zhou, Yangfan; Huang, Kaizhu; Cheng, Cheng; Wang, Xuguang; Hussain, Amir; Liu, Xin. Towards Faster Training Algorithms Exploiting Bandit Sampling From Convex to Strongly Convex Conditions. IEEE Transactions on Emerging Topics in Computational Intelligence, 2023, 7(2): 565-577.
  • [39] Hayashi, Naoki; Sugiura, Tomohiro; Kajiyama, Yuichi; Takai, Shigemasa. Event-Triggered Consensus-Based Optimization Algorithm for Smooth and Strongly Convex Cost Functions. 2018 IEEE Conference on Decision and Control (CDC), 2018: 2120-2125.
  • [40] Garber, Dan; Hazan, Elad. Faster Rates for the Frank-Wolfe Method over Strongly-Convex Sets. International Conference on Machine Learning, Vol 37, 2015: 541-549.