Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation

Cited by: 2
Authors
Scutari, Marco [1 ]
Affiliations
[1] Istituto Dalle Molle di Studi sull'Intelligenza Artificiale (IDSIA), CH-6900 Lugano, Switzerland
Keywords
Bayesian networks; Shannon entropy; Kullback-Leibler divergence; propagation; inference
DOI
10.3390/a17010024
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure makes high-dimensional problems tractable by dividing them into a sparse collection of smaller ones; it also underlies Judea Pearl's theory of causality and determines their explainability and interpretability. Despite their popularity, there are almost no resources in the literature on how to compute Shannon's entropy and the Kullback-Leibler (KL) divergence for BNs under their most common distributional assumptions. In this paper, we provide computationally efficient algorithms for both by leveraging BNs' graphical structure, and we illustrate them with a complete set of numerical examples. In the process, we show it is possible to reduce the computational complexity of KL from cubic to quadratic for Gaussian BNs.
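To illustrate the kind of structural shortcut the abstract describes, here is a minimal NumPy sketch (not the paper's algorithm) for the entropy of a Gaussian BN. For a linear Gaussian BN the joint distribution is multivariate normal, and det(Σ) factorises into the product of the nodes' conditional variances, so the entropy H(X) = (n/2)·log(2πe) + (1/2)·Σᵢ log(σᵢ²) can be accumulated node by node without ever assembling the full covariance matrix. The three-node chain and its coefficients below are hypothetical numbers, chosen only to check the local formula against the dense one.

```python
import numpy as np

def gaussian_bn_entropy(cond_vars):
    """Entropy of a linear Gaussian BN from the conditional variances
    sigma_i^2 of its nodes (no full covariance matrix needed)."""
    cond_vars = np.asarray(cond_vars, dtype=float)
    n = cond_vars.size
    return 0.5 * n * np.log(2 * np.pi * np.e) + 0.5 * np.log(cond_vars).sum()

# Toy chain X1 -> X2 -> X3 (hypothetical coefficients, for illustration).
b12, b23 = 0.8, -0.5            # edge regression coefficients
v = np.array([1.0, 0.5, 2.0])   # conditional variances sigma_i^2

# Dense check: X = B X + eps gives Sigma = (I - B)^{-1} D (I - B)^{-T},
# so H = 1/2 * log((2*pi*e)^n * det(Sigma)).
B = np.array([[0.0, 0.0, 0.0],
              [b12, 0.0, 0.0],
              [0.0, b23, 0.0]])
A = np.linalg.inv(np.eye(3) - B)
Sigma = A @ np.diag(v) @ A.T

dense = 0.5 * np.log((2 * np.pi * np.e) ** 3 * np.linalg.det(Sigma))
local = gaussian_bn_entropy(v)
assert np.isclose(dense, local)  # the two computations agree
```

The dense route costs O(n³) in general (matrix inverse and determinant), whereas the node-by-node accumulation is linear in the number of nodes once the conditional variances are known, which is the flavour of saving the paper develops for entropy and KL divergence.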
Pages: 32