Entropic Priors and Bayesian Model Selection

Cited by: 5
Authors
Brewer, Brendon J. [1 ]
Francis, Matthew J. [2 ]
Affiliations
[1] Univ Calif Santa Barbara, Dept Phys, Santa Barbara, CA 93106 USA
[2] SiSSA, Trieste, Italy
Keywords
Inference; Model Selection; Dark Energy; COSMOLOGY; SUPERNOVAE; LAMBDA;
DOI
10.1063/1.3275612
CLC classification
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
We demonstrate that the principle of maximum relative entropy (ME), used judiciously, can ease the specification of priors in model selection problems. The resulting effect is that models that make sharp predictions are disfavoured, weakening the usual Bayesian "Occam's Razor". This is illustrated with a simple example involving what Jaynes called a "sure thing" hypothesis. Jaynes' resolution of the situation involved introducing a large number of alternative "sure thing" hypotheses that were possible before we observed the data. However, in more complex situations, it may not be possible to explicitly enumerate large numbers of alternatives. The entropic priors formalism produces the desired result without modifying the hypothesis space or requiring explicit enumeration of alternatives; all that is required is a good model for the prior predictive distribution for the data. This idea is illustrated with a simple rigged-lottery example, and we outline how this idea may help to resolve a recent debate amongst cosmologists: is dark energy a cosmological constant, or has it evolved with time in some way? And how shall we decide, when the data are in?
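The rigged-lottery idea can be sketched numerically. The toy below is not the paper's calculation: it assumes the entropic prior down-weights a model by exp(-alpha * D_KL) of its predictive distribution from the prior predictive, and the ticket count, winning ticket, and alpha are all illustrative choices. With a uniform prior predictive, the "sure thing" hypothesis pays a penalty of exactly log N, which cancels its factor-of-N likelihood advantage when the predicted ticket happens to win.

```python
import math

N = 1000  # tickets in the lottery (illustrative size, not from the paper)

# Predictive distributions over which ticket wins:
# H0 (fair lottery): uniform.  H1 ("sure thing"): ticket 37 wins for certain.
p_fair = [1.0 / N] * N
p_sure = [0.0] * N
p_sure[37] = 1.0

# Prior predictive distribution for the data: taken uniform here.
m = [1.0 / N] * N

def kl(p, q):
    """Relative entropy D(p || q), with the convention 0 * log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

alpha = 1.0  # strength of the sharpness penalty (hypothetical tuning knob)

# Entropic prior over models: sharper predictions get smaller prior weight.
w_fair = math.exp(-alpha * kl(p_fair, m))  # = 1, since p_fair equals m
w_sure = math.exp(-alpha * kl(p_sure, m))  # = exp(-log N) = 1/N
z = w_fair + w_sure
prior_fair, prior_sure = w_fair / z, w_sure / z

# Observed data: ticket 37 wins.  Likelihoods under each model:
like_fair, like_sure = 1.0 / N, 1.0

post_sure = (prior_sure * like_sure
             / (prior_sure * like_sure + prior_fair * like_fair))
print(post_sure)  # 0.5: the entropic prior exactly offsets the "sure thing"
```

With alpha = 1 the posterior comes out 50/50 even though the sure-thing hypothesis predicted the winner, mirroring the abstract's point that sharp predictions are disfavoured a priori without having to enumerate the other 999 possible sure-thing hypotheses by hand.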
Pages: 179 / +
Page count: 2