On capacity of memory in chaotic neural networks with incremental learning

Cited: 0
Authors
Deguchi, Toshinori [1 ]
Matsuno, Keisuke [1 ]
Ishii, Naohiro [2 ]
Institutions
[1] Gifu Natl Coll Technol, Gifu, Japan
[2] Aichi Inst Technol, Aichi, Japan
Keywords
DOI
not available
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural networks are able to learn more patterns with incremental learning than with correlative learning. Incremental learning is a method for composing an associative memory using a chaotic neural network. In earlier work, it was found that the capacity of the network increases with its size, i.e. the number of neurons in the network, up to some threshold size, and decreases beyond that size. The threshold size and the capacity varied between two different learning parameters. In this paper, the capacity of the networks was investigated while changing the learning parameter. Computer simulations showed that the capacity also increases in proportion to the network size at larger sizes, and that the capacity of the network with incremental learning is more than 11 times larger than that with correlative learning.
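The correlative learning the abstract uses as its baseline is, in the associative-memory tradition, Hebbian outer-product storage of patterns in a symmetric weight matrix. A minimal sketch of that baseline follows (a Hopfield-style toy model for illustration, not the authors' chaotic neural network or their incremental-learning rule; the network size `N`, pattern count `P`, and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # network size: number of neurons
P = 5    # number of stored patterns, well below the classical ~0.14*N capacity

# Random bipolar (+1/-1) patterns to store.
patterns = rng.choice([-1, 1], size=(P, N))

# Correlative (Hebbian) learning: W = (1/N) * sum_p x_p x_p^T, zero diagonal.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(state, steps=10):
    """Synchronous recall: iterate sign(W @ state) until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern by flipping 10% of its bits, then recall it.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recovered = recall(probe)
overlap = (recovered @ patterns[0]) / N  # overlap of 1.0 means perfect recall
```

With the load P/N kept far below the correlative-learning capacity limit, the corrupted probe relaxes back to the stored pattern; the paper's point is that incremental learning on a chaotic network stores many more patterns per neuron than this outer-product scheme.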
Pages: 919 - +
Page count: 2
Related papers
50 records total
  • [1] Capacity of Memory and Error Correction Capability in Chaotic Neural Networks with Incremental Learning
    Deguchi, Toshinori
    Matsuno, Keisuke
    Kimura, Toshiki
    Ishii, Naohiro
    COMPUTER AND INFORMATION SCIENCE 2009, 2009, 208 : 295 - 302
  • [2] On Capacity with Incremental Learning by Simplified Chaotic Neural Network
    Deguchi, Toshinori
    Ishii, Naohiro
    THEORY AND PRACTICE OF NATURAL COMPUTING (TPNC 2018), 2018, 11324 : 377 - 387
  • [3] Studies on the memory capacity and robustness of chaotic dynamic neural networks
    Beliaev, Igor
    Kozma, Robert
    2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10, 2006, : 3991 - +
  • [4] MEMORY CAPACITY OF NEURAL NETWORKS LEARNING WITHIN BOUNDS
    GORDON, MB
    JOURNAL DE PHYSIQUE, 1987, 48 (12): : 2053 - 2058
  • [5] On Learning Parameters of Incremental Learning in Chaotic Neural Network
    Deguchi, Toshinori
    Ishii, Naohiro
    ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2016, 2016, 629 : 241 - 252
  • [6] On Acceleration of Incremental Learning in Chaotic Neural Network
    Deguchi, Toshinori
    Takahashi, Toshiki
    Ishii, Naohiro
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, PT II, 2015, 9095 : 370 - 379
  • [7] On Simplification of Chaotic Neural Network on Incremental Learning
    Deguchi, Toshinori
    Takahashi, Toshiki
    Ishii, Naohiro
    2014 15TH IEEE/ACIS INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ARTIFICIAL INTELLIGENCE, NETWORKING AND PARALLEL/DISTRIBUTED COMPUTING (SNPD), 2014, : 203 - 206
  • [8] Memory Efficient Invertible Neural Networks for Class-Incremental Learning
    Hocquet, Guillaume
    Bichler, Olivier
    Querlioz, Damien
    2021 IEEE 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE CIRCUITS AND SYSTEMS (AICAS), 2021,
  • [9] CHAOTIC NEURAL NETWORKS AND ASSOCIATIVE MEMORY
    IKEGUCHI, T
    ADACHI, M
    AIHARA, K
    LECTURE NOTES IN COMPUTER SCIENCE, 1991, 540 : 17 - 24
  • [10] On Temporal Summation in Chaotic Neural Network with Incremental Learning
    Deguchi, Toshinori
    Takahashi, Toshiki
    Ishii, Naohiro
    INTERNATIONAL JOURNAL OF SOFTWARE INNOVATION, 2014, 2 (04) : 72 - 84