Two-Layer Feedback Neural Networks with Associative Memories

Cited by: 0
Authors
Wu Gui-Kun [1 ]
Zhao Hong
Affiliations
[1] Xiamen Univ, Dept Phys, Xiamen 361005, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC number
O4 [Physics]
Discipline code
0702
Abstract
We construct a two-layer feedback neural network with a Monte Carlo-based algorithm to store memories as fixed-point attractors or as limit-cycle attractors. Special attention is focused on comparing the dynamics of the network with limit-cycle attractors to those with fixed-point attractors. We find that the former has better retrieval properties than the latter. In particular, spurious memories may be suppressed completely when the memories are stored as a long limit cycle. Potential applications of limit-cycle-attractor networks are discussed briefly.
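The abstract gives only a high-level description, so the following is a minimal illustrative sketch, not the authors' Monte Carlo-based algorithm or their two-layer architecture: it uses the standard pseudo-inverse (projection) rule on a single recurrent weight matrix to store random +/-1 patterns either as fixed-point attractors (each pattern maps to itself) or as one limit cycle (each pattern maps to its successor). All function names and parameters below are illustrative assumptions.

import numpy as np

# Illustrative sketch only: the paper's Monte Carlo construction of a
# two-layer feedback network is not detailed in the abstract. A single
# recurrent weight matrix built with the pseudo-inverse (projection)
# rule stores +/-1 patterns as fixed points or as one limit cycle.

rng = np.random.default_rng(0)

def store(patterns, as_cycle=False):
    """Weight matrix mapping each pattern to itself (fixed points)
    or to its successor in the list (a limit cycle)."""
    X = np.array(patterns, dtype=float).T          # shape (N, P)
    Y = np.roll(X, -1, axis=1) if as_cycle else X  # target of each pattern
    return Y @ np.linalg.pinv(X)                   # solves W X = Y

def run(W, state, steps):
    """Deterministic parallel-update dynamics s <- sign(W s)."""
    trajectory = [state.copy()]
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
        trajectory.append(state.copy())
    return trajectory

N, P = 100, 4
patterns = [rng.choice([-1, 1], size=N) for _ in range(P)]

# Fixed-point storage: a corrupted cue should relax back onto the pattern.
W_fp = store(patterns)
cue = patterns[0].copy()
cue[rng.choice(N, size=10, replace=False)] *= -1   # flip 10% of the bits
final = run(W_fp, cue, steps=20)[-1]
print("fixed-point retrieval overlap:", np.dot(final, patterns[0]) / N)

# Limit-cycle storage: the same patterns stored as one length-P cycle,
# so the state should step through the whole sequence periodically.
W_lc = store(patterns, as_cycle=True)
traj = run(W_lc, patterns[0].copy(), steps=2 * P)
overlaps = [np.dot(s, patterns[t % P]) / N for t, s in enumerate(traj)]
print("limit-cycle overlaps along the trajectory:", np.round(overlaps, 2))

In the cycle variant the retrieved state visits every stored pattern once per period, which is the kind of limit-cycle retrieval the abstract contrasts with static fixed-point retrieval; how the paper's Monte Carlo training shapes the basins of attraction is not recoverable from the abstract alone.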
Pages: 3871-3874 (4 pages)