Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization

Cited by: 0
Authors
Gong, Chengyue [1 ]
Peng, Jian [2 ]
Liu, Qiang [1 ]
Affiliations
[1] UT Austin, Dept Comp Sci, Austin, TX 78712 USA
[2] Univ Illinois, Champaign, IL USA
Funding
U.S. National Science Foundation;
Keywords
REPRESENTATION; RISK;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Batch Bayesian optimization has been shown to be an efficient and successful approach for black-box function optimization, especially when the evaluation of the cost function is highly expensive but can be efficiently parallelized. In this paper, we introduce a novel variational framework for batch query optimization, based on the argument that the query batch should be selected to have both high diversity and good worst-case performance. This motivates us to propose a variational objective that combines a quantile-based risk measure (for worst-case performance) with entropy regularization (for enforcing diversity). We derive a gradient-based particle optimization algorithm for solving our quantile-based variational objective, which generalizes Stein variational gradient descent (SVGD) by Liu & Wang (2016). We evaluate our method on a number of real-world applications and show that it consistently outperforms other recent state-of-the-art batch Bayesian optimization methods.
Pages: 10
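
The abstract builds on the SVGD particle update of Liu & Wang (2016), which the paper generalizes to a quantile-based batch objective. Below is a minimal NumPy sketch of the plain SVGD step only, as a point of reference; the function names (rbf_kernel, svgd_step, grad_log_p), the median-heuristic bandwidth, and the step size are illustrative assumptions, and the paper's quantile-based risk measure and entropy-regularized batch objective are not reproduced here.

import numpy as np

def rbf_kernel(X, h=None):
    """RBF kernel k(x, y) = exp(-||x - y||^2 / h) evaluated on particles X of shape (n, d).

    Returns the kernel matrix K and, for each particle i, the repulsive term
    sum_j grad_{x_j} k(x_j, x_i), which pushes particles apart (diversity).
    """
    diffs = X[:, None, :] - X[None, :, :]        # diffs[i, j] = x_i - x_j, shape (n, n, d)
    sq_dists = np.sum(diffs ** 2, axis=-1)       # (n, n)
    if h is None:                                # median heuristic for the bandwidth
        h = np.median(sq_dists) / np.log(X.shape[0] + 1) + 1e-8
    K = np.exp(-sq_dists / h)
    # grad_{x_j} k(x_j, x_i) = (2 / h) * (x_i - x_j) * K[i, j], summed over j
    repulsion = (2.0 / h) * np.einsum('ij,ijd->id', K, diffs)
    return K, repulsion

def svgd_step(X, grad_log_p, step_size=1e-2):
    """One SVGD update (Liu & Wang, 2016): move particles along the kernelized Stein direction
        phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].

    grad_log_p: callable mapping an (n, d) array of particles to the (n, d)
    array of gradients of the log target density at those particles.
    """
    K, repulsion = rbf_kernel(X)
    n = X.shape[0]
    phi = (K @ grad_log_p(X) + repulsion) / n    # driving term + repulsive term
    return X + step_size * phi

# Sanity-check usage: drive 50 particles toward a standard 2-D Gaussian target,
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) * 3.0
for _ in range(500):
    X = svgd_step(X, grad_log_p=lambda x: -x)

In the batch Bayesian optimization setting described in the abstract, grad_log_p would be replaced by the gradient of the paper's quantile-based (worst-case) utility built from the surrogate model's posterior; that component depends on details not included in this record.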