Hebbian learning of context in recurrent neural networks

Cited: 55
Authors
Brunel, N. [1]
Affiliation
[1] Univ Roma La Sapienza, Inst Fis, Ist Nazl Fis Nucl, I-00185 Rome, Italy
Keywords
DOI
10.1162/neco.1996.8.8.1677
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Single-electrode recordings in the inferotemporal cortex of monkeys during delayed visual memory tasks provide evidence for attractor dynamics in the observed region. The persistent elevated delay activities could be internal representations of features of the learned visual stimuli shown to the monkey during training. When uncorrelated stimuli are presented in a fixed sequence during training, these experiments reveal significant correlations between the internal representations. Recently, a simple attractor neural network model reproduced the measured correlations quantitatively. An underlying assumption of the model is that the synaptic matrix formed during the training phase carries, in its efficacies, information about the contiguity of persistent stimuli in the training sequence. We present here a simple unsupervised learning dynamics that produces such a synaptic matrix when sequences of stimuli are repeatedly presented to the network in a fixed order. The resulting matrix is then shown to convert temporal correlations during training into spatial correlations between attractors. The scenario is that, in the presence of selective delay activity, the activity distribution in the neural assembly at the presentation of each stimulus contains information about both the current stimulus and the previous one (carried by the attractor). Thus the recurrent synaptic matrix can code not only for each of the stimuli presented to the network but also for their context. We combine the idea that, for learning to be effective, synaptic modification should be stochastic with the fact that attractors provide learnable information about two consecutive stimuli. We calculate explicitly the probability distribution of synaptic efficacies as a function of the training protocol, that is, the order in which stimuli are presented to the network. We then solve for the dynamics of a network composed of integrate-and-fire excitatory and inhibitory neurons with a matrix of synaptic collaterals resulting from the learning dynamics. The network has stable spontaneous activity, and stable delay activity develops after a critical learning stage. The availability of a learning dynamics makes possible a number of experimental predictions for the dependence of the delay activity distributions, and of the correlations between them, on the learning stage and the learning protocol. In particular, it makes specific predictions for pair-associates delay experiments.
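To make the mechanism concrete, the following is a minimal NumPy sketch of the core idea, not the paper's exact model: it assumes binary stochastic synapses, a hypothetical mixing weight `lam` for the attractor trace of the previous stimulus, and illustrative transition probabilities `q_pot` and `q_dep`; all parameter values are chosen for demonstration only. Repeated presentation of a fixed stimulus sequence should then leave stronger couplings between the neural populations coding for consecutive stimuli than between unrelated ones.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200            # neurons
P = 5              # stimuli in the fixed training sequence
f = 0.1            # coding level: fraction of neurons active per stimulus
lam = 0.3          # hypothetical weight of the previous stimulus' attractor trace
q_pot, q_dep = 0.05, 0.02  # illustrative stochastic transition probabilities
epochs = 50        # repetitions of the training sequence

# Uncorrelated binary stimuli, as in the delayed visual memory experiments.
xi = (rng.random((P, N)) < f).astype(float)

# Binary synaptic efficacies J[i, j] in {0, 1}, presynaptic j -> postsynaptic i.
J = np.zeros((N, N))

for _ in range(epochs):
    for mu in range(P):
        # During stimulus mu, the activity distribution mixes the current
        # stimulus with a trace of the previous one, carried by the attractor.
        eta = xi[mu] if mu == 0 else np.clip(xi[mu] + lam * xi[mu - 1], 0.0, 1.0)
        pre = rng.random(N) < eta     # stochastic presynaptic activity
        post = rng.random(N) < eta    # stochastic postsynaptic activity
        # Stochastic Hebbian transitions: potentiate co-active pairs with
        # probability q_pot; depress pre-active/post-silent pairs with q_dep.
        pot = np.outer(post, pre) & (rng.random((N, N)) < q_pot)
        dep = np.outer(~post, pre) & (rng.random((N, N)) < q_dep)
        J[pot] = 1.0
        J[dep] = 0.0
np.fill_diagonal(J, 0.0)

# Temporal contiguity during training becomes spatial correlation: couplings
# between populations coding consecutive stimuli exceed those between distant ones.
def mean_coupling(a, b):
    return float(xi[a] @ J @ xi[b]) / (f * N) ** 2

print("consecutive pairs:", np.mean([mean_coupling(m, m + 1) for m in range(P - 1)]))
print("distant pairs:    ", np.mean([mean_coupling(m, m + 2) for m in range(P - 2)]))
```

Under these assumptions the first printed value should noticeably exceed the second: the lam-weighted trace of the preceding stimulus co-activates two populations during learning, so stochastic potentiation links them, which is the spatial signature of the temporal order in the training protocol.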
Pages: 1677-1710
Number of pages: 34
Related Papers
50 records in total
  • [21] Implementation Challenges and Strategies for Hebbian Learning in Convolutional Neural Networks
    Demidovskij, A. V.
    Kazyulina, M. S.
    Salnikov, I. G.
    Tugaryov, A. M.
    Trutnev, A. I.
    Pavlov, S. V.
    OPTICAL MEMORY AND NEURAL NETWORKS, 2023, 32 (Suppl 2) : S252 - S264
  • [22] Emergence of symmetric, modular, and reciprocal connections in recurrent networks with Hebbian learning
    Sherwin E. Hua
    James C. Houk
    Ferdinando A. Mussa-Ivaldi
    Biological Cybernetics, 1999, 81 : 211 - 225
  • [23] Configuring interaction of memorized patterns with an asymmetric Hebbian rule for recurrent neural networks
    Ishii, T
    Kyuma, K
    NEUROCOMPUTING, 1996, 10 (01) : 43 - 53
  • [24] Learning Queuing Networks by Recurrent Neural Networks
    Garbi, Giulio
    Incerto, Emilio
    Tribastone, Mirco
PROCEEDINGS OF THE ACM/SPEC INTERNATIONAL CONFERENCE ON PERFORMANCE ENGINEERING (ICPE'20), 2020: 56 - 66
  • [25] Mapping Hebbian Learning Rules to Coupling Resistances for Oscillatory Neural Networks
    Delacour, Corentin
    Todri-Sanial, Aida
    FRONTIERS IN NEUROSCIENCE, 2021, 15
  • [26] Dense Hebbian neural networks: A replica symmetric picture of supervised learning
    Agliari E.
    Albanese L.
    Alemanno F.
    Alessandrelli A.
    Barra A.
    Giannotti F.
    Lotito D.
    Pedreschi D.
    Physica A: Statistical Mechanics and its Applications, 2023, 626
  • [27] A Mathematical Analysis of the Effects of Hebbian Learning Rules on the Dynamics and Structure of Discrete-Time Random Recurrent Neural Networks
    Siri, Benoit
    Berry, Hugues
    Cessac, Bruno
    Delord, Bruno
    Quoy, Mathias
    NEURAL COMPUTATION, 2008, 20 (12) : 2937 - 2966
  • [28] Hebbian learning using fixed weight evolved dynamical 'Neural' networks
    Izquierdo-Torres, Eduardo
    Harvey, Inman
2007 IEEE SYMPOSIUM ON ARTIFICIAL LIFE, 2006: 394 - +
  • [29] Dense Hebbian neural networks: A replica symmetric picture of unsupervised learning
    Agliari, Elena
    Albanese, Linda
    Alemanno, Francesco
    Alessandrelli, Andrea
    Barra, Adriano
    Giannotti, Fosca
    Lotito, Daniele
    Pedreschi, Dino
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2023, 627
  • [30] Learning with recurrent neural networks - Conclusion
[Author unknown]
    LEARNING WITH RECURRENT NEURAL NETWORKS, 2000, 254 : 133 - 135