Optimal Belief Approximation

Cited by: 12
Authors
Leike, Reimar H. [1 ,2 ]
Ensslin, Torsten A. [1 ,2 ]
Affiliations
[1] Max Planck Inst Astrophys, Karl Schwarzschildstr 1, D-85748 Garching, Germany
[2] Ludwig Maximilians Univ Munchen, Geschwister Scholl Pl 1, D-80539 Munich, Germany
Source
ENTROPY | 2017, Vol. 19, Iss. 08
Keywords
information theory; Bayesian inference; loss function; axiomatic derivation; machine learning; relative entropy
DOI
10.3390/e19080402
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
In Bayesian statistics, probability distributions express beliefs. However, for many problems the beliefs cannot be computed analytically, and approximations of beliefs are needed. We seek a loss function that quantifies how "embarrassing" it is to communicate a given approximation. We reproduce and discuss an old proof showing that there is only one ranking under the requirements that (1) the best-ranked approximation is the non-approximated belief and (2) the ranking judges approximations only by their predictions for actual outcomes. The loss function obtained in the derivation is equal to the Kullback-Leibler divergence when normalized. This loss function is frequently used in the literature. However, there seems to be confusion about the correct order in which its functional arguments, the approximated and non-approximated beliefs, should be used. The correct order ensures that the recipient of a communication is deprived of only the minimal amount of information. We hope that this elementary derivation settles the apparent confusion. For example, when approximating beliefs with Gaussian distributions, the optimal approximation is given by moment matching. This is in contrast to many suggested computational schemes.
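The abstract's claim about argument order can be illustrated numerically: among Gaussians, the moment-matched one minimizes the Kullback-Leibler divergence D_KL(p || q) of the true belief p from the approximation q. The sketch below is a hypothetical example (not from the paper), using a bimodal mixture as the "true" belief and comparing a moment-matched Gaussian against one fitted to a single mode.

```python
import numpy as np

def norm_pdf(x, mu, sigma):
    """Gaussian density N(x; mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Grid for numerical integration
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

# "True" belief p: a bimodal mixture of two Gaussians (hypothetical example)
p = 0.5 * norm_pdf(x, -2.0, 0.5) + 0.5 * norm_pdf(x, 2.0, 0.5)

# Moment-matched Gaussian: same mean and variance as p
mean = np.sum(x * p) * dx
var = np.sum((x - mean) ** 2 * p) * dx
q_moment = norm_pdf(x, mean, np.sqrt(var))

# Alternative: a Gaussian locked onto one mode of p
q_mode = norm_pdf(x, 2.0, 0.5)

def kl(p_, q_):
    """Numerical D_KL(p || q) on the grid (true belief first)."""
    mask = p_ > 1e-300  # skip points where p vanishes
    return np.sum(p_[mask] * np.log(p_[mask] / q_[mask])) * dx

# In the order KL(true || approx), moment matching gives the smaller loss:
print(kl(p, q_moment), kl(p, q_mode))
```

Reversing the argument order (minimizing D_KL(q || p)) would instead favor the mode-seeking fit, which is exactly the confusion the paper addresses.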
Pages: 9
Related Papers
50 entries total
  • [1] Approximation of belief functions
    Weiler, T
    INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2003, 11 (06) : 749 - 777
  • [2] Tree approximation for belief updating
    Mateescu, R
    Dechter, R
    Kask, K
    EIGHTEENTH NATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI-02)/FOURTEENTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE (IAAI-02), PROCEEDINGS, 2002, : 553 - 559
  • [3] Belief Propagation, Bethe Approximation and Polynomials
    Straszak, Damian
    Vishnoi, Nisheeth K.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2019, 65 (07) : 4353 - 4363
  • [4] Approximation of Data by Decomposable Belief Models
    Jirousek, Radim
    INFORMATION PROCESSING AND MANAGEMENT OF UNCERTAINTY IN KNOWLEDGE-BASED SYSTEMS: THEORY AND METHODS, PT 1, 2010, 80 : 40 - 49
  • [5] Belief Propagation, Bethe Approximation and Polynomials
    Straszak, Damian
    Vishnoi, Nisheeth K.
    2017 55TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2017, : 666 - 671
  • [6] α Belief Propagation as Fully Factorized Approximation
    Liu, Dong
    Moghadam, Nima N.
    Rasmussen, Lars K.
    Huang, Jinliang
    Chatterjee, Saikat
    2019 7TH IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (IEEE GLOBALSIP), 2019,
  • [7] Optimality Guarantees for Particle Belief Approximation of POMDPs
    Lim, M. H.
    Becker, T. J.
    Kochenderfer, M. J.
    Tomlin, C. J.
    Sunberg, Z. N.
    Journal of Artificial Intelligence Research, 2023, 77 : 1591 - 1636
  • [8] Bayesian approximation and invariance of Bayesian belief functions
    Joshi, AV
    Sahasrabudhe, SC
    Shankar, K
    SYMBOLIC AND QUANTITATIVE APPROACHES TO REASONING AND UNCERTAINTY, 1995, 946 : 251 - 258
  • [9] Truth approximation, belief merging, and peer disagreement
    Cevolani, Gustavo
    SYNTHESE, 2014, 191 (11) : 2383 - 2401
  • [10] Active Inference, Belief Propagation, and the Bethe Approximation
    Schwoebel, Sarah
    Kiebel, Stefan
    Markovic, Dimitrije
    NEURAL COMPUTATION, 2018, 30 (09) : 2530 - 2567