On the relationship between deterministic and probabilistic directed graphical models: From Bayesian networks to recursive neural networks

Cited: 11
Authors
Baldi, P [1]
Rosen-Zvi, M
Affiliations
[1] Univ Calif Irvine, Sch Informat & Comp Sci, Irvine, CA 92697 USA
[2] Univ Calif Irvine, Inst Genom & Bioinformat, Irvine, CA 92697 USA
[3] Hebrew Univ Jerusalem, Sch Comp Sci & Engn, IL-91904 Jerusalem, Israel
Funding
US National Institutes of Health (NIH); US National Science Foundation (NSF);
Keywords
Bayesian networks; belief propagation; recursive neural networks; recurrent neural networks; constraint networks; graphical models;
DOI
10.1016/j.neunet.2005.07.007
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Machine learning methods that can handle variable-size structured data such as sequences and graphs include Bayesian networks (BNs) and Recursive Neural Networks (RNNs). In both classes of models, the data is modeled using a set of observed and hidden variables associated with the nodes of a directed acyclic graph. In BNs, the conditional relationships between parent and child variables are probabilistic, whereas in RNNs they are deterministic and parameterized by neural networks. Here, we study the formal relationship between both classes of models and show that when the source node variables are observed, RNNs can be viewed as limits, both in distribution and probability, of BNs with local conditional distributions that have vanishing covariance matrices and converge to delta functions. Conditions for uniform convergence are also given together with an analysis of the behavior and exactness of Belief Propagation (BP) in 'deterministic' BNs. Implications for the design of mixed architectures and the corresponding inference algorithms are briefly discussed. (c) 2005 Elsevier Ltd. All rights reserved.
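The limiting relationship described in the abstract can be illustrated with a small numerical sketch (not from the paper's own code; the network weights, `tanh` activation, and Gaussian conditional form below are illustrative assumptions): a single BN child node whose conditional distribution is Gaussian around a neural activation of its parents converges to the deterministic RNN-style value as the variance vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_child(parents, w, sigma, n=100_000):
    """Draw n samples of a BN child node with conditional
    distribution N(tanh(w . parents), sigma^2): a stochastic
    version of a deterministic neural-network unit."""
    mean = np.tanh(w @ parents)
    return mean + sigma * rng.normal(size=n)

# Observed source (parent) variables and illustrative weights.
parents = np.array([0.5, -1.0])
w = np.array([2.0, 1.0])

# The deterministic RNN-style computation the BN converges to.
deterministic = np.tanh(w @ parents)

# As sigma -> 0 the samples concentrate on the deterministic value,
# mimicking the convergence of the conditional density to a delta.
for sigma in (1.0, 0.1, 0.01):
    s = sample_child(parents, w, sigma)
    print(f"sigma={sigma}: |mean - f(parents)|={abs(s.mean() - deterministic):.4f}, "
          f"spread={s.std():.4f}")
```

The printed spread shrinks with sigma while the sample mean stays pinned at the deterministic activation, which is the distributional-limit statement of the abstract in miniature.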
Pages: 1080-1086
Page count: 7