Investigation of Alternative Measures for Mutual Information

Cited by: 2
Authors
Kuskonmaz, Bulut [1]
Gundersen, Jaron S. [1]
Wisniewski, Rafal [1]
Affiliation
[1] Aalborg Univ, Dept Elect Syst, Aalborg, Denmark
Source
IFAC PAPERSONLINE | 2022 / Vol. 55 / No. 16
DOI
10.1016/j.ifacol.2022.09.016
CLC Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Mutual information I(X; Y) is a useful quantity in information theory that measures how much information the random variable Y carries about the random variable X. One way to define mutual information is to compare the joint distribution of X and Y with the product of their marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, Y leaks almost no information about X, since the two variables are then close to independent. In the discrete setting, mutual information has the convenient interpretation of the number of bits Y reveals about X; in the continuous case, however, this interpretation no longer applies. This fact motivates trying other metrics or divergences in place of the KL divergence. In this paper, we evaluate different metrics and divergences as alternatives to mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance. Copyright (C) 2022 The Authors.
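The discrete-case definition described in the abstract, I(X; Y) = D_KL(P_XY || P_X P_Y), can be sketched in a few lines. This is a minimal illustrative implementation, not code from the paper; the function name and toy distributions are our own.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, computed as the KL divergence
    between the joint distribution and the product of its marginals.

    `joint` is a 2-D array with joint[i, j] = P(X = i, Y = j).
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    py = joint.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    prod = px * py                          # product of marginals P(X)P(Y)
    mask = joint > 0                        # convention: 0 * log 0 = 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))

# Fully dependent (Y = X, one fair coin flip): Y reveals exactly 1 bit of X.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0

# Independent: joint equals the product of marginals, so the KL divergence
# (and hence the mutual information) is zero and nothing about X leaks.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The zero-mass mask encodes the standard convention 0·log 0 = 0, so distributions with impossible outcomes are handled without dividing by zero.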
Pages: 154 - 159
Page count: 6
Related Papers
50 records
  • [41] Topics Inference by Weighted Mutual Information Measures Computed from Structured Corpus
    Chang, Harry
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS, 2011, 6716 : 64 - 75
  • [42] Mutual association measures
    Borroni, Claudio G.
    STATISTICAL METHODS AND APPLICATIONS, 2019, 28 (04): 571 - 591
  • [44] α-Mutual Information
    Verdu, Sergio
    2015 INFORMATION THEORY AND APPLICATIONS WORKSHOP (ITA), 2015, : 1 - 6
  • [45] Research synthesis of information theory measures of uncertainty: Meta-analysis of entropy and mutual information of diagnostic tests
    Tsalatsanis, Athanasios
    Hozo, Iztok
    Djulbegovic, Benjamin
    JOURNAL OF EVALUATION IN CLINICAL PRACTICE, 2021, 27 (02) : 246 - 255
  • [46] Comparison of co-expression measures: mutual information, correlation, and model based indices
    Song, Lin
    Langfelder, Peter
    Horvath, Steve
    BMC BIOINFORMATICS, 2012, 13
  • [47] ORIENT: Submodular Mutual Information Measures for Data Subset Selection under Distribution Shift
    Karanam, Athresh
    Killamsetty, Krishnateja
    Kokel, Harsha
    Iyer, Rishabh K.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [48] Distributed multi-label feature selection using individual mutual information measures
    Gonzalez-Lopez, Jorge
    Ventura, Sebastian
    Cano, Alberto
    KNOWLEDGE-BASED SYSTEMS, 2020, 188 (188)
  • [49] Objective Intelligibility Measures Based on Mutual Information for Speech Subjected to Speech Enhancement Processing
    Taghia, Jalal
    Martin, Rainer
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2014, 22 (01) : 6 - 16