Investigation of Alternative Measures for Mutual Information

Cited by: 2
Authors
Kuskonmaz, Bulut [1]
Gundersen, Jaron S. [1]
Wisniewski, Rafal [1]
Institutions
[1] Aalborg Univ, Dept Elect Syst, Aalborg, Denmark
Source
IFAC PAPERSONLINE | 2022, Vol. 55, No. 16
DOI
10.1016/j.ifacol.2022.09.016
Chinese Library Classification
TP [automation technology, computer technology];
Discipline Code
0812 ;
Abstract
Mutual information I(X; Y) is a useful quantity in information theory for estimating how much information the random variable Y holds about the random variable X. One way to define mutual information is to compare the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there is almost no leakage of X from Y, since the two variables are close to being independent. In the discrete setting, mutual information has the nice interpretation of how many bits Y reveals about X. In the continuous case, however, the same reasoning does not apply, which opens the door to using other metrics or divergences in the definition. In this paper, we evaluate different metrics and divergences as alternatives to mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance. Copyright (C) 2022 The Authors.
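The KL-based definition in the abstract can be made concrete in the discrete setting, where the bits interpretation holds. The sketch below computes MI as the KL divergence between the joint distribution and the product of its marginals, and also shows a total-variation distance between the same two distributions as one illustrative alternative divergence; the specific alternatives studied in the paper may differ, so treat this as an assumption-laden illustration rather than the paper's method.

```python
from math import log2

def marginals(joint):
    """Marginal distributions of X and Y from a joint given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return px, py

def mutual_information_kl(joint):
    """Discrete MI in bits: KL divergence between the joint distribution
    and the product of its marginals."""
    px, py = marginals(joint)
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def total_variation_alternative(joint):
    """Illustrative alternative measure (our choice, not necessarily the
    paper's): total variation distance between the joint and the product
    of marginals. It is zero exactly when X and Y are independent."""
    px, py = marginals(joint)
    return 0.5 * sum(abs(joint.get((x, y), 0.0) - px[x] * py[y])
                     for x in px for y in py)

# X and Y perfectly correlated: Y reveals one full bit about X.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
# X and Y independent: both measures vanish, i.e. no leakage of X from Y.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
```

On the correlated example the KL-based MI is 1 bit while the total-variation measure is 0.5; both are zero for the independent example, matching the intuition that closeness of the joint to the product of marginals means near-independence.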
Pages: 154 - 159
Page count: 6
Related Papers
(50 in total)
  • [21] Input variable selection: Mutual information and linear mixing measures
    Trappenberg, T
    Ouyang, J
    Back, A
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2006, 18 (01) : 37 - 46
  • [22] On conclusive eavesdropping and measures of mutual information in quantum key distribution
    Rastegin, Alexey E.
    QUANTUM INFORMATION PROCESSING, 2016, 15 : 1225 - 1239
  • [23] Reply to "Comment on 'Multiparty quantum mutual information: An alternative definition' "
    Kumar, Asutosh
    PHYSICAL REVIEW A, 2023, 108 (06)
  • [24] Information Entropy and Mutual Information-based Uncertainty Measures in Rough Set Theory
    Sun, Lin
    Xu, Jiucheng
    APPLIED MATHEMATICS & INFORMATION SCIENCES, 2014, 8 (04): : 1973 - 1985
  • [25] Orthopartitions and soft clustering: Soft mutual information measures for clustering validation
    Campagner, Andrea
    Ciucci, Davide
    KNOWLEDGE-BASED SYSTEMS, 2019, 180 : 51 - 61
  • [26] Synchronization in Simple Network Motifs with Negligible Correlation and Mutual Information Measures
    Soriano, Miguel C.
    Van der Sande, Guy
    Fischer, Ingo
    Mirasso, Claudio R.
    PHYSICAL REVIEW LETTERS, 2012, 108 (13)
  • [27] Use of Average Mutual Information and Derived Measures to Find Coding Regions
    Newcomb, Garin
    Sayood, Khalid
    ENTROPY, 2021, 23 (10)
  • [28] POLARIMETRIC SAR DATA FEATURE SELECTION USING MEASURES OF MUTUAL INFORMATION
    Tanase, R.
    Radoi, A.
    Datcu, M.
    Raducanu, D.
    2015 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2015, : 1140 - 1143
  • [29] Entropy, mutual information, and systematic measures of structured spiking neural networks
    Li, Wenjie
    Li, Yao
    JOURNAL OF THEORETICAL BIOLOGY, 2020, 501
  • [30] Statistical validation of mutual information calculations: Comparison of alternative numerical algorithms
    Cellucci, CJ
    Albano, AM
    Rapp, PE
    PHYSICAL REVIEW E, 2005, 71 (06)