Discriminative training of feed-forward and recurrent sum-product networks by extended Baum-Welch

Cited by: 2
Authors
Duan, Haonan [1 ,2 ]
Rashwan, Abdullah [1 ,2 ]
Poupart, Pascal [1 ,2 ]
Chen, Zhitang [3 ]
Affiliations
[1] Univ Waterloo, Waterloo AI Inst, Waterloo, ON, Canada
[2] Vector Inst, Toronto, ON, Canada
[3] Huawei Technol, Hong Kong, Peoples R China
Keywords
Sum-product network; Extended Baum-Welch; Discriminative learning; PROBABILISTIC FUNCTIONS; STATISTICAL ESTIMATION; MAXIMIZATION; INEQUALITY;
DOI
10.1016/j.ijar.2020.02.007
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We present a discriminative learning algorithm for feed-forward Sum-Product Networks (SPNs) [42] and recurrent SPNs (RSPNs) [31] based on the Extended Baum-Welch (EBW) algorithm [4]. We formulate the conditional data likelihood in the SPN framework as a rational function and use EBW to maximize it monotonically. We derive the algorithm for SPNs and RSPNs with both discrete and continuous variables. Experiments show that this algorithm outperforms both generative Expectation-Maximization and discriminative gradient descent on a wide variety of applications. We also demonstrate the robustness of the algorithm to missing features by comparing its performance to Support Vector Machines and Neural Networks. (C) 2020 Elsevier Inc. All rights reserved.
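The abstract's key idea is that the conditional likelihood of an SPN is a rational function, which the classical EBW growth transform can maximize monotonically over simplex-constrained parameters. The sketch below illustrates that transform on a toy rational objective; the objective, constant C, and function names are illustrative assumptions, not the paper's SPN formulation.

```python
import numpy as np

def ebw_update(theta, grad, C):
    """One EBW growth-transform step on simplex parameters:
    theta_i <- theta_i * (dR/dtheta_i + C), renormalized.
    For sufficiently large C, the rational objective R does not decrease."""
    w = theta * (grad + C)
    return w / w.sum()

def rational_objective(theta):
    # Toy rational function R = N/D with nonnegative coefficients
    # (illustrative stand-in for an SPN conditional likelihood).
    N = theta[0] ** 2 + 2.0 * theta[1] * theta[2]  # numerator polynomial
    D = 1.0 + theta[0] * theta[1]                  # denominator polynomial
    return N / D

def numeric_grad(f, theta, eps=1e-6):
    # Central finite differences; a real implementation would use
    # the SPN's analytic derivatives.
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(theta + e) - f(theta - e)) / (2.0 * eps)
    return g

theta = np.array([0.5, 0.3, 0.2])  # parameters on the probability simplex
before = rational_objective(theta)
for _ in range(20):
    theta = ebw_update(theta, numeric_grad(rational_objective, theta), C=10.0)
after = rational_objective(theta)
print(before, after)  # the objective increases monotonically across steps
```

Each step stays on the simplex by construction (multiplicative update followed by normalization), which is why EBW needs no projection or learning-rate tuning, in contrast to the gradient-descent baseline mentioned in the abstract.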
Pages: 66-81
Page count: 16