Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks

Cited by: 2
Authors:
Galatolo, Federico A. [1]
Cimino, Mario G. C. A. [1]
Vaglini, Gigliola [1]
Affiliations:
[1] Univ Pisa, Dept Informat Engn, I-56122 Pisa, Italy
Keywords:
Artificial Neural Networks; Recurrent Neural Network; Stigmergy; Deep Learning; Supervised Learning
DOI:
10.5220/0007581508300836
CLC classification:
TP18 [Artificial Intelligence Theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
In this paper, a novel Recurrent Neural Network (RNN) architecture is designed and experimentally evaluated. The proposed RNN adopts a computational memory based on the concept of stigmergy. The basic principle of a Stigmergic Memory (SM) is that the activity of depositing/removing a quantity in the SM stimulates subsequent deposit/removal activities. Accordingly, successive SM activities tend to reinforce or weaken each other, generating a coherent coordination between the SM activities and the temporal input stimulus. We show that, in a supervised classification problem, the SM encodes the temporal input in an emergent representational model by coordinating the deposit, removal and classification activities. This study lays down a basic framework for the derivation of an SM-RNN. A formal ontology of SM is discussed, and the SM-RNN architecture is detailed. To assess the computational power of an SM-RNN, comparative NNs have been selected and trained on the MNIST handwritten digit recognition benchmark in its two variants: spatial (sequences of bitmap rows) and temporal (sequences of pen strokes).
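The deposit/removal feedback loop described in the abstract can be illustrated with a minimal scalar sketch. This is not the authors' SM-RNN implementation: the update rule, the gain and evaporation parameters, and the function name below are all hypothetical, chosen only to show how past deposits stimulate new deposits and how accumulated marks drive removal.

```python
def stigmergic_memory(inputs, deposit_gain=0.5, removal_gain=0.3, evaporation=0.1):
    """Toy stigmergic memory over a scalar mark level.

    Each deposit is modulated by the current mark (past deposits stimulate
    new deposits), while removal is proportional to the accumulated mark,
    so successive activities reinforce or weaken one another.
    All parameters are illustrative assumptions, not taken from the paper.
    """
    mark = 0.0
    trace = []
    for x in inputs:
        deposit = deposit_gain * x * (1.0 + mark)  # reinforcement by past marks
        removal = removal_gain * mark              # removal driven by accumulation
        mark = (1.0 - evaporation) * mark + deposit - removal
        mark = max(mark, 0.0)                      # a mark level cannot go negative
        trace.append(mark)
    return trace

# While the stimulus persists the mark grows super-linearly (reinforcement);
# once the stimulus stops, evaporation and removal make it decay.
trace = stigmergic_memory([1.0, 1.0, 0.0, 0.0, 1.0])
```

In this toy form the mark level plays the role of the emergent temporal representation: its trajectory depends on the whole input history, not just the current stimulus.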
Pages: 830 - 836
Page count: 7
Related Papers
(50 items in total)
  • [41] Network of Recurrent Neural Networks: Design for Emergence
    Wang, Chaoming
    Zeng, Yi
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT II, 2018, 11302 : 89 - 102
  • [42] Using Recurrent Neural Networks for Decompilation
    Katz, Deborah S.
    Ruchti, Jason
    Schulte, Eric
    2018 25TH IEEE INTERNATIONAL CONFERENCE ON SOFTWARE ANALYSIS, EVOLUTION AND REENGINEERING (SANER 2018), 2018, : 346 - 356
  • [43] Computational acceleration using neural networks
    Cadaret, Paul
    SENSORS, AND COMMAND, CONTROL, COMMUNICATIONS, AND INTELLIGENCE (C3I) TECHNOLOGIES FOR HOMELAND SECURITY AND HOMELAND DEFENSE VII, 2008, 6943
  • [44] Reconstructing computational system dynamics from neural data with recurrent neural networks
    Durstewitz, Daniel
    Koppe, Georgia
    Thurm, Max Ingo
    NATURE REVIEWS NEUROSCIENCE, 2023, 24 (11) : 693 - 710
  • [45] Delay and Recurrent Neural Networks: Computational Cybernetics of Systems Biology?
    Dimirovski, Georgi M.
    Wang, Rui
    Yang, Bin
    2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 906 - 911
  • [46] Computational Capabilities of Recurrent Neural Networks Based on their Attractor Dynamics
    Cabessa, Jeremie
    Villa, Alessandro E. P.
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [48] Android Malware Detection Using Long Short Term Memory Recurrent Neural Networks
    Georgieva, Lilia
    Lamarque, Basile
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON APPLIED CYBER SECURITY (ACS) 2021, 2022, 378 : 42 - 52
  • [49] Wind Speed Forecasting Using Recurrent Neural Networks and Long Short Term Memory
    Ningsih, Fitriana R.
    Djamal, Esmeralda C.
    Najmurrakhman, Asep
    PROCEEDINGS OF THE 2019 6TH INTERNATIONAL CONFERENCE ON INSTRUMENTATION, CONTROL, AND AUTOMATION (ICA), 2019, : 137 - 141
  • [50] Design of a Photonic Crystal Exclusive-OR Gate Using Recurrent Neural Networks
    Karami, Pouya
    Yahya, Salah I.
    Chaudhary, Muhammad Akmal
    Assaad, Maher
    Roshani, Saeed
    Hazzazi, Fawwaz
    Parandin, Fariborz
    Roshani, Sobhan
    SYMMETRY-BASEL, 2024, 16 (07):