Regularized Stein Variational Gradient Flow

Cited by: 0
Authors
He, Ye [1 ]
Balasubramanian, Krishnakumar [2 ]
Sriperumbudur, Bharath K. [3 ]
Lu, Jianfeng [4 ]
Affiliations
[1] Georgia Inst Technol, Sch Math, 686 Cherry St, Atlanta, GA 30332 USA
[2] Univ Calif Davis, Dept Stat, 399 Crocker Lane,1 Shields Ave, Davis, CA 95616 USA
[3] Penn State Univ, Dept Stat, 314 Thomas Bldg, University Pk, PA 16802 USA
[4] Duke Univ, Math Dept, Box 90320,120 Sci Dr, Durham, NC 27708 USA
Keywords
Wasserstein gradient flow; Stein variational gradient descent; Particle-based sampling; Convergence to equilibrium; Mean-field analysis; Reproducing kernel Hilbert space; Regularization; Convergence; Diffusion; Kernels
DOI
10.1007/s10208-024-09663-w
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
The Stein variational gradient descent (SVGD) algorithm is a deterministic particle method for sampling. However, a mean-field analysis reveals that the gradient flow corresponding to the SVGD algorithm (i.e., the Stein Variational Gradient Flow) only provides a constant-order approximation to the Wasserstein gradient flow corresponding to the KL-divergence minimization. In this work, we propose the Regularized Stein Variational Gradient Flow, which interpolates between the Stein Variational Gradient Flow and the Wasserstein gradient flow. We establish various theoretical properties of the Regularized Stein Variational Gradient Flow (and its time discretization), including convergence to equilibrium, existence and uniqueness of weak solutions, and stability of the solutions. We provide preliminary numerical evidence of the improved performance offered by the regularization.
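For context, the abstract refers to the standard SVGD particle update (Liu and Wang, 2016), whose mean-field limit is the Stein Variational Gradient Flow discussed above. Below is a minimal numpy sketch of that baseline update, not of the regularized flow proposed in this paper; the RBF kernel, bandwidth `h`, and step size are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, h):
    """RBF kernel matrix K[j, i] = exp(-||x_j - x_i||^2 / h) and its
    gradient with respect to the first argument x_j."""
    diff = X[:, None, :] - X[None, :, :]          # diff[j, i] = x_j - x_i
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # (n, n)
    gradK = (-2.0 / h) * diff * K[:, :, None]     # (n, n, d)
    return K, gradK

def svgd_step(X, grad_log_p, step=0.1, h=1.0):
    """One SVGD update:
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j)
                             + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    drift = K.T @ grad_log_p(X)       # kernel-smoothed score (attraction)
    repulsion = gradK.sum(axis=0)     # keeps particles spread out
    return X + step * (drift + repulsion) / n

# Toy run: push particles toward a standard Gaussian target,
# for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=1.0, size=(100, 2))
for _ in range(500):
    X = svgd_step(X, lambda x: -x, step=0.2, h=1.0)
```

The deterministic repulsion term is what distinguishes SVGD from Langevin-type samplers; the paper's regularization modifies the velocity field so the resulting flow interpolates toward the exact Wasserstein gradient flow of the KL divergence.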
Pages: 59