Entropy and Information Content of Geostatistical Models

Cited by: 0
Author
Thomas Mejer Hansen
Affiliation
[1] Aarhus University, Department of Geoscience
Source
Mathematical Geosciences | 2021 / Volume 53
Keywords
Entropy; Simulation; Multiple-point statistics
DOI
Not available
Abstract
Geostatistical models quantify spatial relations between model parameters and can be used to estimate and simulate properties away from known observations. The underlying statistical model, quantified through a joint probability density, most often consists of both an assumed statistical model and the specific choice of algorithm, including tuning parameters controlling the algorithm. Here, a theory is developed that allows one to compute the entropy of the underlying multivariate probability density when sampled using sequential simulation. The self-information of a single realization can be computed as the sum of the conditional self-information. The entropy is the average of the self-information obtained for many independent realizations. For discrete probability mass functions, a measure of the effective number of free model parameters, implied by a specific choice of probability mass function, is proposed. Through a few examples, the entropy measure is used to quantify the information content related to different choices of simulation algorithms and tuning parameters.
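In generic notation (the symbols below are illustrative and not necessarily those used in the paper), the two quantities described in the abstract can be written as follows: for a realization m = (m_1, ..., m_M) produced by sequential simulation, the chain rule of probability expresses its self-information as a sum of conditional self-informations, and the entropy is estimated as the average self-information over N independent realizations:

    h(\mathbf{m}) = -\log p(\mathbf{m}) = -\sum_{i=1}^{M} \log p(m_i \mid m_1, \ldots, m_{i-1})

    H = \mathrm{E}[h(\mathbf{m})] \approx \frac{1}{N} \sum_{n=1}^{N} h(\mathbf{m}^{(n)})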
Pages: 163-184
Number of pages: 21
Related Papers
50 records in total
  • [1] Entropy and Information Content of Geostatistical Models
    Hansen, Thomas Mejer
    MATHEMATICAL GEOSCIENCES, 2021, 53 (01) : 163 - 184
  • [2] Quantifying Information Content in Survey Data by Entropy
    Dahl, Fredrik A.
    Osteras, Nina
    ENTROPY, 2010, 12 (02) : 161 - 163
  • [3] Entropy and information content of laboratory test results
    Vollmer, Robin T.
    AMERICAN JOURNAL OF CLINICAL PATHOLOGY, 2007, 127 (01) : 60 - 65
  • [4] Entropy and information rates for hidden Markov models
    Ko, HS
    Baran, RH
    1998 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY - PROCEEDINGS, 1998: 374 - 374
  • [5] Entropy Power, Autoregressive Models, and Mutual Information
    Gibson, Jerry
    ENTROPY, 2018, 20 (10)
  • [6] INFORMATION-CONTENT OF PRODUCTION MODELS
    CAMISASSA, G
    FRE, P
    SERTORIO, L
    NUOVO CIMENTO DELLA SOCIETA ITALIANA DI FISICA B-GENERAL PHYSICS RELATIVITY ASTRONOMY AND MATHEMATICAL PHYSICS AND METHODS, 1975, 30 (02): : 364 - 389
  • [7] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [8] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12):
  • [9] Entropy increase and information loss in Markov models of evolution
    Sober, Elliott
    Steel, Mike
    BIOLOGY & PHILOSOPHY, 2011, 26 (02) : 223 - 250