Bayesian optimization for likelihood-free cosmological inference

Cited by: 53
Authors
Leclercq, Florent [1 ,2 ]
Affiliations
[1] Imperial Coll London, Blackett Lab, ICIC, Prince Consort Rd, London SW7 2AZ, England
[2] Imperial Coll London, Blackett Lab, Astrophys Grp, Prince Consort Rd, London SW7 2AZ, England
Keywords
DATA-COMPRESSION;
DOI
10.1103/PhysRevD.98.063511
Chinese Library Classification (CLC)
P1 [Astronomy];
Subject classification code
0704
Abstract
Many cosmological models have only a finite number of parameters of interest, but a very expensive data-generating process and an intractable likelihood function. We address the problem of performing likelihood-free Bayesian inference from such black-box simulation-based models, under the constraint of a very limited simulation budget (typically a few thousand). To do so, we adopt an approach based on the likelihood of an alternative parametric model. Conventional approaches to approximate Bayesian computation such as likelihood-free rejection sampling are impractical for the considered problem, due to the lack of knowledge about how the parameters affect the discrepancy between observed and simulated data. As a response, we make use of a strategy previously developed in the machine learning literature (Bayesian optimization for likelihood-free inference, BOLFI), which combines Gaussian process regression of the discrepancy to build a surrogate surface with Bayesian optimization to actively acquire training data. We extend the method by deriving an acquisition function tailored for the purpose of minimizing the expected uncertainty in the approximate posterior density, in the parametric approach. The resulting algorithm is applied to the problems of summarizing Gaussian signals and inferring cosmological parameters from the joint lightcurve analysis supernovae data. We show that the number of required simulations is reduced by several orders of magnitude, and that the proposed acquisition function produces more accurate posterior approximations, as compared to common strategies.
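The sketch below illustrates, in broad strokes, the kind of loop the abstract describes: a Gaussian-process surrogate is fit to the discrepancy between observed and simulated data, and a Bayesian-optimization acquisition rule chooses where to run the next simulation. It is a minimal illustration under stated assumptions, not the paper's method: the toy one-parameter simulator, the mean-based discrepancy, the expected-improvement acquisition (the paper derives a different acquisition targeting expected posterior uncertainty), and the ABC-style posterior construction at the end are all placeholders chosen for brevity.

```python
# Minimal BOLFI-style sketch (assumptions: toy 1-parameter simulator, mean
# discrepancy, expected-improvement acquisition; the paper uses a tailored
# acquisition function and a parametric likelihood approximation instead).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    """Toy black-box simulator: Gaussian signal with unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

observed = simulator(1.3)  # pretend observation

def discrepancy(theta):
    """Distance between observed and simulated summary statistics."""
    return np.abs(simulator(theta).mean() - observed.mean())

# Initial design: a handful of simulations drawn from the prior range.
bounds = (-5.0, 5.0)
thetas = rng.uniform(*bounds, size=10)
deltas = np.array([discrepancy(t) for t in thetas])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
candidates = np.linspace(*bounds, 512)

# Bayesian-optimization loop: refit the GP surrogate of the discrepancy,
# then acquire the next simulation where improvement is most promising.
for _ in range(40):
    gp.fit(thetas.reshape(-1, 1), deltas)
    mu, sigma = gp.predict(candidates.reshape(-1, 1), return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    best = deltas.min()
    z = (best - mu) / sigma
    ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    t_next = candidates[np.argmax(ei)]
    thetas = np.append(thetas, t_next)
    deltas = np.append(deltas, discrepancy(t_next))

# ABC-style approximate posterior from the surrogate: the GP probability that
# the discrepancy falls below a tolerance epsilon, under a flat prior.
epsilon = np.quantile(deltas, 0.1)
mu, sigma = gp.predict(candidates.reshape(-1, 1), return_std=True)
approx_post = norm.cdf((epsilon - mu) / np.maximum(sigma, 1e-9))
approx_post /= np.trapz(approx_post, candidates)
```

Because every acquisition step reuses the surrogate rather than discarding rejected simulations, the number of simulator calls stays in the tens to thousands, which is the regime the abstract targets.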
Pages: 24