Associative memory by recurrent neural networks with delay elements

Cited by: 10
Authors
Miyoshi, S
Yanai, HF
Okada, M
Affiliations
[1] Kobe City Coll Technol, Dept Elect Engn, Nishi Ku, Kobe, Hyogo 6512194, Japan
[2] Ibaraki Univ, Fac Engn, Dept Media & Telecommun, Hitachi, Ibaraki 3168511, Japan
[3] Japan Sci & Technol Corp, Exploratory Res Adv Technol, Kyoto 6190288, Japan
[4] RIKEN Brain Sci Inst, Lab Math Neurosci, Wako, Saitama 3510198, Japan
Keywords
sequential associative memory; neural network; delay; statistical neurodynamics;
DOI
10.1016/S0893-6080(03)00207-7
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The synapses of real neural systems appear to have delays, so it is worthwhile to analyze associative memory models with delayed synapses. Here, a sequential associative memory model with delayed synapses is discussed, in which a discrete synchronous updating rule and a correlation learning rule are employed. Its dynamic properties are analyzed by statistical neurodynamics. In this paper, we first re-derive the Yanai-Kim theory, which gives macrodynamical equations for the dynamics of a network with serial delay elements. Since their theory requires a computational complexity of O(L^4 t) to obtain the macroscopic state at time step t, where L is the length of delay, it is intractable for discussing the macroscopic properties in the large-L limit. We therefore derive steady-state equations using the discrete Fourier transformation, whose computational complexity does not formally depend on L. We show that the storage capacity alpha_C is proportional to the delay length L in the large-L limit, with proportionality constant 0.195, i.e. alpha_C = 0.195L. These results are supported by computer simulations. (C) 2003 Elsevier Ltd. All rights reserved.
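To make the model described in the abstract concrete (delayed synapses, correlation learning, discrete synchronous updates), the following is a minimal Python sketch of sequence recall in such a network. All parameter values (N, L, the number of stored patterns) and the variable names are illustrative assumptions, not the authors' code or settings; the weight construction follows the standard correlation rule for sequential recall with a delay line.

```python
import numpy as np

# Illustrative sketch (not the paper's code): sequential associative memory
# with L serial delay elements per synapse, correlation learning, and
# synchronous updates. Parameter values below are assumptions for the demo.
rng = np.random.default_rng(0)
N = 500          # number of neurons
L = 3            # delay length (number of delay elements per synapse)
P = 30           # number of stored patterns in a cyclic sequence

# Random +/-1 patterns forming a cyclic sequence xi[0] -> xi[1] -> ...
xi = rng.choice([-1, 1], size=(P, N))

# Correlation learning: the synapse with delay l associates the pattern seen
# l+1 steps in the past with the current target pattern,
#   J^l = (1/N) * sum_mu xi^{mu+l+1} (xi^{mu})^T   (indices mod P).
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + l + 1) % P], xi[mu]) / N

# Recall: load the delay line with consecutive patterns, then update
# synchronously and track the overlap with the expected next pattern.
# history[l] holds the network state l+1 steps in the past.
history = [xi[(L - 1 - l) % P].copy() for l in range(L)]
for t in range(L, L + 20):
    h = sum(J[l] @ history[l] for l in range(L))   # local field
    x_new = np.sign(h)
    x_new[x_new == 0] = 1                           # break ties
    overlap = x_new @ xi[t % P] / N                 # should stay near 1
    print(f"step {t}: overlap with xi[{t % P}] = {overlap:.3f}")
    history = [x_new] + history[:-1]                # shift the delay line
```

The parameters above keep the loading well below capacity, so the printed overlaps stay near 1; the abstract's result is that in the large-L limit the critical loading grows as alpha_C = 0.195L, i.e. roughly 0.195*L*N patterns can be stored.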
Pages: 55-63
Number of pages: 9