Investigation of Alternative Measures for Mutual Information

Cited by: 2
Authors:
Kuskonmaz, Bulut [1 ]
Gundersen, Jaron S. [1 ]
Wisniewski, Rafal [1 ]
Institutions:
[1] Aalborg Univ, Dept Elect Syst, Aalborg, Denmark
Source:
IFAC PAPERSONLINE, 2022, Vol. 55, Issue 16
DOI:
10.1016/j.ifacol.2022.09.016
Chinese Library Classification (CLC): TP [automation technology; computer technology]
Discipline code: 0812
Abstract:
Mutual information I(X; Y) is a useful quantity in information theory for estimating how much information the random variable Y carries about the random variable X. One way to define mutual information is to compare the joint distribution of X and Y with the product of their marginals via the Kullback-Leibler (KL) divergence. If these two distributions are close to each other, Y leaks almost no information about X, since the two variables are then close to independent. In the discrete setting, mutual information has the appealing interpretation of how many bits Y reveals about X; in the continuous case this interpretation does not carry over. This motivates substituting other metrics or divergences in the definition of mutual information. In this paper, we evaluate several metrics and divergences as alternatives to mutual information in the continuous case, deploy different methods to estimate or bound them, and compare their performance. Copyright (C) 2022 The Authors.
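The definition referenced in the abstract, I(X; Y) = KL(P_XY || P_X P_Y), is easy to check in the discrete case, where the KL form recovers the familiar bit-count interpretation. Below is a minimal illustrative sketch (not from the paper); the example joint distributions are assumptions chosen to show the two extremes of full dependence and independence.

```python
import math

def mutual_information(p_xy):
    """Discrete I(X;Y) = KL(P_XY || P_X * P_Y), in bits.

    p_xy: dict mapping (x, y) -> probability of that pair.
    """
    # Marginals P_X and P_Y obtained by summing out the other variable.
    p_x, p_y = {}, {}
    for (x, y), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # KL divergence between the joint and the product of marginals.
    mi = 0.0
    for (x, y), p in p_xy.items():
        if p > 0.0:
            mi += p * math.log2(p / (p_x[x] * p_y[y]))
    return mi

# Perfectly correlated fair bits: Y reveals all of X, so I(X;Y) = 1 bit.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(correlated))  # -> 1.0

# Independent fair bits: joint equals product of marginals, so I(X;Y) = 0.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(mutual_information(independent))  # -> 0.0
```

In the continuous case the analogous quantity uses densities rather than probabilities and loses the bits interpretation, which is the freedom the paper exploits when swapping KL for other metrics or divergences.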
Pages: 154-159
Page count: 6
Related Papers (50 records total)
  • [31] Optimal binning for a variance based alternative of mutual information in pattern recognition
    Fazekas, Attila
    Kovacs, Gyorgy
    NEUROCOMPUTING, 2023, 519 : 135 - 147
  • [32] A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
    Marrelec, Guillaume
    Messe, Arnaud
    Bellec, Pierre
    PLOS ONE, 2015, 10 (09):
  • [33] Investigation of Average Mutual Information for Species Separation Using GSOM
    Chan, Chon-Kit Kenneth
    Halgamuge, Saman
    FUTURE GENERATION INFORMATION TECHNOLOGY, PROCEEDINGS, 2009, 5899 : 42 - 49
  • [34] Alternative measures of health information and demand for fats and oils in Japan
    Kim, SR
    Chern, WS
    JOURNAL OF CONSUMER AFFAIRS, 1999, 33 (01) : 92 - 109
  • [35] Mutual information based distance measures for classification and content recognition with applications to genetics
    Dawy, Z
    Hagenauer, J
    Hanus, P
    Mueller, JC
    ICC 2005: IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, VOLS 1-5, 2005, : 820 - 824
  • [36] Normalized Measures of Mutual Information with General Definitions of Entropy for Multimodal Image Registration
    Cahill, Nathan D.
    BIOMEDICAL IMAGE REGISTRATION, 2010, 6204 : 258 - 268
  • [37] Quantification of the relationship between MEG and fMRI datasets through measures of mutual information
    Barnes, GR
    Hillebrand, A
    Singh, KD
    JOURNAL OF PSYCHOPHYSIOLOGY, 2003, 17 (04) : 229 - 229
  • [38] Low overlap image registration based on both entropy and mutual information measures
    de Cesare, Cedric
    Rendas, Maria-Joao
    Allais, Anne-Gaelle
    Perrier, Michel
    OCEANS 2008, VOLS 1-4, 2008, : 1791 - +
  • [39] Feature selection using mutual information based uncertainty measures for tumor classification
    Sun, Lin
    Xu, Jiucheng
    BIO-MEDICAL MATERIALS AND ENGINEERING, 2014, 24 (01) : 763 - 770
  • [40] Mutual Information Measures for Subclass Error-Correcting Output Codes Classification
    Arvanitopoulos, Nikolaos
    Bouzas, Dimitrios
    Tefas, Anastasios
    ARTIFICIAL INTELLIGENCE: THEORIES, MODELS AND APPLICATIONS, PROCEEDINGS, 2010, 6040 : 19 - +