Bounding the Bias of Contrastive Divergence Learning

Cited by: 37
Authors
Fischer, Asja [1 ]
Igel, Christian [2 ]
Affiliations
[1] Ruhr Univ Bochum, Inst Neuroinformat, D-44780 Bochum, Germany
[2] Univ Copenhagen, Dept Comp Sci, DK-2100 Copenhagen O, Denmark
Keywords
DOI
10.1162/NECO_a_00085
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Numbers
081104; 0812; 0835; 1405
Abstract
Optimization based on k-step contrastive divergence (CD) has become a common way to train restricted Boltzmann machines (RBMs). The k-step CD is a biased estimator of the log-likelihood gradient relying on Gibbs sampling. We derive a new upper bound for this bias. Its magnitude depends on k, the number of variables in the RBM, and the maximum change in energy that can be produced by changing a single variable. The latter reflects the dependence on the absolute values of the RBM parameters. The magnitude of the bias is also affected by the distance in variation between the modeled distribution and the starting distribution of the Gibbs chain.
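For orientation, the sketch below shows how the k-step CD gradient estimate discussed in the abstract is typically computed for a binary RBM. It is an illustrative assumption, not code from the paper; the names W, b, c, v0, and cd_k_grad are hypothetical, and the Gibbs chain is started at a data point and truncated after k steps, which is exactly the source of the bias the paper bounds.

```python
# Illustrative sketch (not from the paper): CD-k gradient estimate for a
# binary RBM. All variable and function names are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_grad(v0, W, b, c, k, rng=None):
    """Return CD-k estimates of the log-likelihood gradients w.r.t. W, b, c.

    v0 : (n_vis,) binary training vector that starts the Gibbs chain.
    W  : (n_vis, n_hid) weights; b : visible bias; c : hidden bias.
    The estimate is biased because the chain is stopped after k steps
    instead of being run to equilibrium.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Positive phase: hidden activation probabilities given the data vector.
    ph0 = sigmoid(v0 @ W + c)

    # k steps of blocked Gibbs sampling, starting from the data point.
    v = v0.copy()
    for _ in range(k):
        h = (rng.random(ph0.shape) < sigmoid(v @ W + c)).astype(float)
        v = (rng.random(v0.shape) < sigmoid(h @ W.T + b)).astype(float)
    phk = sigmoid(v @ W + c)

    # Negative phase uses the k-step sample in place of a model sample.
    dW = np.outer(v0, ph0) - np.outer(v, phk)
    db = v0 - v
    dc = ph0 - phk
    return dW, db, dc
```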
Pages: 664-673
Number of pages: 10
Related Papers
50 records in total
  • [1] Adiabatic Persistent Contrastive Divergence Learning
    Jang, Hyeryung
    Choi, Hyungwon
    Yi, Yung
    Shin, Jinwoo
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017,
  • [2] Contrastive Divergence Learning for the Restricted Boltzmann Machine
    Liu, Jian-Wei
    Chi, Guang-Hui
    Luo, Xiong-Lin
2013 NINTH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION (ICNC), 2013: 18-22
  • [3] Contrastive Divergence Learning with Chained Belief Propagation
    Ding, Fan
    Xue, Yexiang
INTERNATIONAL CONFERENCE ON PROBABILISTIC GRAPHICAL MODELS, VOL 138, 2020, 138: 161-172
  • [4] RenyiCL: Contrastive Representation Learning with Skew Renyi Divergence
    Lee, Kyungmin
    Shin, Jinwoo
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [5] On-chip contrastive divergence learning in analogue VLSI
    Fleury, P
    Chen, H
    Murray, AF
2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004: 1723-1728
  • [6] Latent Bias Mitigation via Contrastive Learning
    Gao, Yue
    Zhang, Shu
    Sun, Jun
    Yu, Shanshan
    Yoshii, Akihito
PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE IN ELECTRONICS ENGINEERING, AIEE 2024, 2024: 42-47
  • [7] Pipelined parallel contrastive divergence for continuous generative model learning
    Pedroni, Bruno U.
    Sheik, Sadique
    Cauwenberghs, Gert
2017 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2017: 212-215
  • [8] Learning about protein hydrogen bonding by minimizing contrastive divergence
    Podtelezhnikov, Alexei A.
    Ghahramani, Zoubin
    Wild, David L.
PROTEINS-STRUCTURE FUNCTION AND BIOINFORMATICS, 2007, 66(03): 588-599
  • [9] Weighted contrastive divergence
    Romero, Enrique
    Mazzanti, Ferran
    Delgado, Jordi
    Buchaca, David
NEURAL NETWORKS, 2019, 114: 147-156
  • [10] Unbiased Classification Through Bias-Contrastive and Bias-Balanced Learning
    Hong, Youngkyu
    Yang, Eunho
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,