Hebbian learning of context in recurrent neural networks

Cited by: 55
Authors
Brunel, N [1 ]
Affiliation
[1] University of Rome La Sapienza, Institute of Physics, INFN, I-00185 Rome, Italy
DOI: 10.1162/neco.1996.8.8.1677
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Single-electrode recordings in the inferotemporal cortex of monkeys during delayed visual memory tasks provide evidence for attractor dynamics in the recorded region. The persistent elevated delay activities could be internal representations of features of the learned visual stimuli shown to the monkey during training. When uncorrelated stimuli are presented during training in a fixed sequence, these experiments reveal significant correlations between the internal representations. Recently, a simple attractor neural network model quantitatively reproduced the measured correlations. An underlying assumption of the model is that the synaptic matrix formed during the training phase carries, in its efficacies, information about the contiguity of persistent stimuli in the training sequence. We present here a simple unsupervised learning dynamics that produces such a synaptic matrix if sequences of stimuli are repeatedly presented to the network in a fixed order. The resulting matrix is then shown to convert temporal correlations during training into spatial correlations between attractors. The scenario is that, in the presence of selective delay activity, at the presentation of each stimulus the activity distribution in the neural assembly contains information about both the current stimulus and the previous one (carried by the attractor). Thus the recurrent synaptic matrix can code not only for each of the stimuli presented to the network but also for their context. We combine the idea that, for learning to be effective, synaptic modification should be stochastic, with the fact that attractors provide learnable information about two consecutive stimuli. We calculate explicitly the probability distribution of synaptic efficacies as a function of the training protocol, that is, the order in which stimuli are presented to the network. We then solve for the dynamics of a network composed of integrate-and-fire excitatory and inhibitory neurons with a matrix of synaptic collaterals resulting from the learning dynamics. The network has stable spontaneous activity, and stable delay activity develops after a critical learning stage. The availability of a learning dynamics makes possible a number of experimental predictions for the dependence of the delay activity distributions, and of the correlations between them, on the learning stage and the learning protocol. In particular, it makes specific predictions for pair-associate delay experiments.
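To make the learning dynamics concrete, below is a minimal sketch, not the paper's exact model, of stochastic Hebbian learning with two-state binary synapses over a fixed, cyclically repeated stimulus sequence. The coding level f, the transition probabilities q_pot and q_dep, the trace strength of the previous attractor, and the simplified rule (potentiate when pre- and postsynaptic units are jointly active, depress when only the presynaptic unit is active) are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of stochastic Hebbian learning over a fixed stimulus
# sequence. All parameter values are illustrative assumptions, not the
# paper's calibrated ones.
rng = np.random.default_rng(0)

N = 200            # neurons
P = 5              # stimuli in the fixed training sequence
f = 0.1            # coding level: fraction of neurons active per stimulus
q_pot, q_dep = 0.05, 0.05  # stochastic potentiation / depression probabilities
trace = 0.3        # assumed strength of the previous attractor's delay trace

# Uncorrelated binary stimuli, one per row.
xi = (rng.random((P, N)) < f).astype(float)

# Binary synaptic efficacies J[i, j] (postsynaptic i, presynaptic j).
J = np.zeros((N, N), dtype=int)

for _ in range(200):                     # repeated presentations of the sequence
    for mu in range(P):
        # Activity mixes the current stimulus with a trace of the previous
        # attractor; this is how temporal context enters the synaptic matrix.
        prev = xi[(mu - 1) % P]          # cyclic sequence assumed
        p_active = np.clip(xi[mu] + trace * prev, 0.0, 1.0)
        act = rng.random(N) < p_active   # stochastic binary activity

        # Stochastic Hebbian transitions between the two efficacy states:
        # potentiate (0 -> 1) when pre and post are both active,
        # depress (1 -> 0) when pre is active but post is silent.
        pot = np.outer(act, act) & (rng.random((N, N)) < q_pot)
        dep = np.outer(~act, act) & (rng.random((N, N)) < q_dep)
        J = np.where(pot, 1, np.where(dep, 0, J))

np.fill_diagonal(J, 0)  # no self-connections

# Because consecutive stimuli are co-active through the delay trace, the
# learned efficacies correlate neighboring stimuli in the sequence.
overlap = xi @ J @ xi.T / (f * f * N * N)
print(np.round(overlap, 2))
```

After training, the off-diagonal band of `overlap` (entries linking adjacent stimuli in the sequence) should sit above the background level, which is the spatial signature of the temporal correlations described in the abstract.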
Pages: 1677-1710 (34 pages)
Related Papers (showing 10 of 50)
  • [1] An analysis of the use of Hebbian and Anti-Hebbian spike time dependent plasticity learning functions within the context of recurrent spiking neural networks
    Carnell, Andrew
    NEUROCOMPUTING, 2009, 72 (4-6) : 685 - 692
  • [2] Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks
    Panda, Priyadarshini
    Roy, Kaushik
    FRONTIERS IN NEUROSCIENCE, 2017, 11
  • [3] The road to chaos by time-asymmetric Hebbian learning in recurrent neural networks
    Molter, Colin
    Salihoglu, Utku
    Bersini, Hugues
    NEURAL COMPUTATION, 2007, 19 (01) : 80 - 110
  • [4] Unsupervised Hebbian learning in neural networks
    Freisleben, B
    Hagen, C
    COMPUTING ANTICIPATORY SYSTEMS: CASYS - FIRST INTERNATIONAL CONFERENCE, 1998, 437 : 606 - 625
  • [5] Modular neural networks with Hebbian learning rule
    Goltsev, Alexander
    Gritsenko, Vladimir
    NEUROCOMPUTING, 2009, 72 (10-12) : 2477 - 2482
  • [6] Contrastive Hebbian Feedforward Learning for Neural Networks
    Kermiche, Noureddine
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (06) : 2118 - 2128
  • [7] Learning context-free grammars with recurrent neural networks
    Harada, T
    Araki, O
    Sakurai, A
    IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001, : 2602 - 2607
  • [8] Contraction Analysis of Hopfield Neural Networks with Hebbian Learning
    Centorrino, Veronica
    Bullo, Francesco
    Russo, Giovanni
    2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 622 - 627
  • [9] Hebbian Learning Meets Deep Convolutional Neural Networks
    Amato, Giuseppe
    Carrara, Fabio
    Falchi, Fabrizio
    Gennaro, Claudio
    Lagani, Gabriele
    IMAGE ANALYSIS AND PROCESSING - ICIAP 2019, PT I, 2019, 11751 : 324 - 334
  • [10] Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network
    Brunel, N
    Carusi, F
    Fusi, S
    NETWORK-COMPUTATION IN NEURAL SYSTEMS, 1998, 9 (01) : 123 - 152