Exploiting symmetries for scaling loopy belief propagation and relational training

Authors
Babak Ahmadi
Kristian Kersting
Martin Mladenov
Sriraam Natarajan
Affiliations
[1] Fraunhofer IAIS, Knowledge Discovery Department
[2] University of Bonn, Institute for Geodesy and Geoinformation
[3] Fraunhofer IAIS
[4] Wake Forest University, Translational Science Institute
Source
Machine Learning | 2013, Vol. 92
Keywords
Statistical relational learning; Lifted inference; Lifted online training; MapReduce
Abstract
Judging by the increasing impact of machine learning on large-scale data analysis in the last decade, one can anticipate a substantial growth in the diversity of machine learning applications for “big data” over the next decade. This exciting new opportunity, however, also raises many challenges. One of them is scaling inference within, and training of, graphical models. Typical ways to address this scaling issue are inference by approximate message passing, stochastic gradients, and MapReduce, among others. Often, we encounter inference and training problems with symmetries and redundancies in the graph structure. A prominent example is relational models, which compactly capture complex dependencies among many interrelated objects. Exploiting these symmetries, however, has not been considered for scaling yet. In this paper, we show that inference and training can indeed benefit from exploiting symmetries. Specifically, we show that (loopy) belief propagation (BP) can be lifted: the model is compressed by grouping together nodes that send and receive identical messages, so that a modified BP running on the lifted graph yields the same marginals as BP on the original one, but often in a fraction of the time. By establishing a link between lifting and radix sort, we show that lifting is MapReduce-able. Still, in many if not most situations, training relational models will not benefit from this (scalable) lifting: symmetries within models easily break, since variables become correlated by virtue of depending asymmetrically on evidence. An appealing idea for such situations is to train and recombine local models. This breaks long-range dependencies and allows lifting to be exploited within and across the local training tasks. Moreover, it naturally paves the way for the first scalable lifted training approaches based on stochastic gradients, both in an online and a MapReduced fashion. On several datasets, the online training, for instance, converges to a solution of the same quality over an order of magnitude faster, simply because it starts optimizing long before it has seen the entire mega-example even once.
Pages: 91–132
Number of pages: 42
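
To make the lifting idea summarized in the abstract concrete, below is a minimal Python sketch of compressing a factor graph by color passing: nodes are repeatedly stamped with signatures built from their own color and their neighbors' colors, and nodes that end up with identical signatures would provably send and receive identical BP messages, so they can be merged into supernodes. The grouping-by-signature step is essentially a bucketing/sorting pass over the signatures, which is the connection to radix sort and MapReduce mentioned in the abstract. The graph encoding, function name, and the toy three-variable chain are illustrative assumptions, not the authors' implementation.

from collections import defaultdict


def lift_by_color_passing(variables, factors, max_iters=100):
    """Group variable nodes that would exchange identical BP messages.

    variables: dict  variable name -> initial color (e.g., an evidence/domain signature)
    factors:   dict  factor name   -> (potential id, scope variables in argument order)
    Returns a list of groups of interchangeable variable names (the supernodes).
    """
    var_color = dict(variables)
    fac_color = {f: pot for f, (pot, _) in factors.items()}
    n_classes = len(set(var_color.values())) + len(set(fac_color.values()))

    for _ in range(max_iters):
        # Each factor's new color: its old color plus the ordered colors of its scope
        # (order matters because a potential need not be symmetric in its arguments).
        new_fac = {f: (fac_color[f], tuple(var_color[v] for v in scope))
                   for f, (_, scope) in factors.items()}

        # Each variable's new color: its old color plus the multiset of colors of its
        # neighboring factors (sorted, since the order of neighbors is irrelevant).
        neighbor_colors = defaultdict(list)
        for f, (_, scope) in factors.items():
            for v in scope:
                neighbor_colors[v].append(new_fac[f])
        new_var = {v: (var_color[v], tuple(sorted(neighbor_colors[v])))
                   for v in variables}

        var_color, fac_color = new_var, new_fac
        new_n = len(set(var_color.values())) + len(set(fac_color.values()))
        if new_n == n_classes:  # the partition stopped refining: fixed point reached
            break
        n_classes = new_n

    # Bucket variables by their final color; this grouping step is the sort-like
    # operation that makes the procedure easy to distribute in a MapReduce fashion.
    groups = defaultdict(list)
    for v, color in var_color.items():
        groups[color].append(v)
    return list(groups.values())


if __name__ == "__main__":
    # Toy chain A - B - C with two copies of the same pairwise potential applied
    # in the same argument order and no evidence: A and C are interchangeable.
    variables = {"A": "no_evidence", "B": "no_evidence", "C": "no_evidence"}
    factors = {"f1": ("pair_pot", ["A", "B"]), "f2": ("pair_pot", ["C", "B"])}
    print(lift_by_color_passing(variables, factors))  # -> [['A', 'C'], ['B']]

A full lifted BP implementation would then run a modified message-passing scheme on the compressed graph, weighting messages by how many original edges each lifted edge represents; that step is omitted in this sketch.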