Equivalence of restricted Boltzmann machines and tensor network states

Cited by: 180
Authors
Chen, Jing [1,2,4]
Cheng, Song [1,2]
Xie, Haidong [1,2]
Wang, Lei [1]
Xiang, Tao [1,3]
Affiliations
[1] Chinese Acad Sci, Inst Phys, POB 603, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Sch Phys Sci, Beijing 100049, Peoples R China
[3] Collaborat Innovat Ctr Quantum Matter, Beijing 100190, Peoples R China
[4] Flatiron Inst, Ctr Computat Quantum Phys, New York, NY 10010 USA
Funding
National Natural Science Foundation of China;
Keywords
MATRIX RENORMALIZATION-GROUP; PRODUCT STATES; DEEP; ENTANGLEMENT; MODELS;
DOI
10.1103/PhysRevB.97.085104
Chinese Library Classification
T [Industrial Technology];
Subject Classification Code
08;
Abstract
The restricted Boltzmann machine (RBM) is one of the fundamental building blocks of deep learning. RBMs find wide application in dimensionality reduction, feature extraction, and recommender systems by modeling the probability distributions of a variety of input data, including natural images, speech signals, and customer ratings. We build a bridge between RBMs and tensor network states (TNS), which are widely used in quantum many-body physics research. We devise efficient algorithms to translate an RBM into the commonly used TNS. Conversely, we give necessary and sufficient conditions to determine whether a TNS can be transformed into an RBM of a given architecture. Revealing these general and constructive connections can cross-fertilize deep learning and quantum many-body physics. Notably, by exploiting the entanglement entropy bound of TNS, we can rigorously quantify the expressive power of an RBM on complex data sets. Insight into TNS and their entanglement capacity can guide the design of more powerful deep learning architectures. On the other hand, an RBM can represent quantum many-body states with fewer parameters than TNS, which may allow more efficient classical simulations.
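As an illustration of the object discussed in the abstract (a minimal sketch under assumed conventions, not the paper's implementation): for binary units, summing out the hidden layer analytically turns the RBM into an unnormalized amplitude psi(v) = exp(a·v) * prod_j (1 + exp(b_j + sum_i W_ij v_i)), and for a handful of spins one can enumerate the full state and compute its bipartite entanglement entropy directly, the quantity that a TNS bond dimension bounds from above (S <= log D across a cut of bond dimension D). All sizes, array names, and random parameters below are arbitrary choices for the example.

import numpy as np

# Illustrative sketch only: a small binary RBM with hidden units summed out
# analytically defines an (unnormalized) amplitude over visible configurations
#   psi(v) = sum_h exp(a.v + b.h + v.W.h)
#          = exp(a.v) * prod_j (1 + exp(b_j + sum_i W_ij v_i)).
rng = np.random.default_rng(0)
n_v, n_h = 6, 3                              # small enough to enumerate all 2**n_v states
a = rng.normal(scale=0.1, size=n_v)          # visible biases
b = rng.normal(scale=0.1, size=n_h)          # hidden biases
W = rng.normal(scale=0.1, size=(n_v, n_h))   # visible-hidden couplings

def rbm_amplitude(v):
    """Unnormalized amplitude of a visible configuration v in {0,1}^n_v."""
    theta = b + v @ W
    return np.exp(a @ v) * np.prod(1.0 + np.exp(theta))

# Build the full state vector by brute-force enumeration and normalize it.
configs = ((np.arange(2**n_v)[:, None] >> np.arange(n_v)) & 1).astype(float)
psi = np.array([rbm_amplitude(v) for v in configs])
psi /= np.linalg.norm(psi)

# Bipartite entanglement entropy across the middle cut; the entanglement bounds
# discussed in the abstract constrain how this quantity can scale with the cut.
half = n_v // 2
singular_values = np.linalg.svd(psi.reshape(2**half, -1), compute_uv=False)
p = singular_values**2
entropy = -np.sum(p * np.log(p + 1e-30))
print(f"entanglement entropy across the middle cut: {entropy:.4f}")

In the paper's setting, the translation algorithms produce a TNS whose bond dimensions upper-bound exactly this kind of entropy; the brute-force enumeration here is only a small-scale sanity check.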
Pages: 16