Contrastive divergence in Gaussian diffusions

Cited: 1
Authors
Movellan, Javier R. [1]
Institutions
[1] Univ Calif San Diego, Inst Neural Computat, La Jolla, CA 92093 USA
DOI
10.1162/neco.2008.01-07-430
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This letter presents an analysis of the contrastive divergence (CD) learning algorithm when applied to continuous-time linear stochastic neural networks. For this case, powerful techniques exist that allow a detailed analysis of the behavior of CD. The analysis shows that CD converges to maximum likelihood solutions only when the network structure is such that it can match the first moments of the desired distribution. Otherwise, CD can converge to solutions arbitrarily different from the maximum likelihood solutions, or it can even diverge. This result suggests the need to improve our theoretical understanding of the conditions under which CD is expected to be well behaved and the conditions under which it may fail. In addition, the results point to practical ideas on how to improve the performance of CD.
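The abstract's claim can be illustrated on the simplest case where CD does reach the maximum likelihood solution: a unit-variance Gaussian energy model whose only parameter is its mean, trained with CD-1 using one Langevin step started from the data. This is a hedged sketch, not the paper's derivation; the learning rate, step size, and data distribution below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: samples from a Gaussian whose mean the model should learn.
data = rng.normal(loc=3.0, scale=1.0, size=2000)

# Model: p(x) ∝ exp(-E(x)) with energy E(x) = (x - mu)^2 / 2,
# i.e. a unit-variance Gaussian with learnable mean mu.
mu = 0.0
lr = 0.5    # learning rate (illustrative choice)
eps = 0.2   # Langevin step size (illustrative choice)

for step in range(200):
    x = data
    # One Langevin step started from the data ("CD-1"):
    # x_tilde = x - eps * dE/dx + sqrt(2*eps) * noise, with dE/dx = x - mu.
    noise = rng.normal(size=x.shape)
    x_tilde = x - eps * (x - mu) + np.sqrt(2 * eps) * noise
    # CD update: contrast -dE/dmu under the data vs. the one-step samples.
    # Since dE/dmu = -(x - mu), the update reduces to lr * mean(x - x_tilde).
    mu += lr * np.mean((x - mu) - (x_tilde - mu))

# Because this model can match the first moment of the data, CD settles
# near the maximum likelihood estimate (the sample mean), consistent
# with the letter's positive case.
```

In richer networks whose structure cannot match the first moments of the target distribution, the letter shows this fixed point can sit arbitrarily far from the maximum likelihood solution, or the iteration can diverge.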
Pages: 2238-2252
Page count: 15