Conditioning and time representation in long short-term memory networks

Cited by: 9
Authors
Rivest, Francois [1 ,2 ]
Kalaska, John F. [3 ]
Bengio, Yoshua [4 ]
Affiliations
[1] Royal Mil Coll Canada, Dept Math & Comp Sci, Stn Forces, Kingston, ON K7K 7B4, Canada
[2] Queens Univ, Ctr Neurosci Studies, Kingston, ON, Canada
[3] Univ Montreal, Dept Physiol, Montreal, PQ H3C 3J7, Canada
[4] Univ Montreal, Dept Comp Sci & Operat Res, Montreal, PQ, Canada
Keywords
Time representation learning; Temporal-difference learning; Long short-term memory networks; Dopamine; Conditioning; Reinforcement learning; PARAMETRIC WORKING-MEMORY; MONKEY DOPAMINE NEURONS; REWARD-PREDICTION; PREMOTOR CORTEX; MODEL; RESPONSES; HIPPOCAMPUS; INTERVALS; DYNAMICS; STIMULUS
DOI
10.1007/s00422-013-0575-1
Chinese Library Classification (CLC): TP3 [Computing technology, computer technology]
Discipline code: 0812
Abstract
Dopaminergic models based on the temporal-difference learning algorithm usually do not differentiate trace from delay conditioning. Instead, they use a fixed temporal representation of elapsed time since conditioned stimulus onset. Recently, a new model was proposed in which timing is learned within a long short-term memory (LSTM) artificial neural network representing the cerebral cortex (Rivest et al. in J Comput Neurosci 28(1):107-130, 2010). In this paper, that model's ability to reproduce and explain relevant data, as well as to make interesting new predictions, is evaluated. The model reveals a strikingly different temporal representation between trace and delay conditioning, since trace conditioning requires working memory to remember the past conditioned stimulus while delay conditioning does not. On the other hand, the model predicts no important difference in dopaminergic (DA) responses between those two conditions when trained on one conditioning paradigm and tested on the other. The model predicts that in trace conditioning, animal timing starts with the conditioned stimulus offset as opposed to its onset. In classical conditioning, it predicts that if the conditioned stimulus does not disappear after the reward, the animal may expect a second reward. Finally, the last simulation reveals that the buildup of activity of some units in the network can adapt to new delays by adjusting their rate of integration. Most importantly, the paper shows that it is possible, with the proposed architecture, to acquire discharge patterns similar to those observed in dopaminergic neurons and in the cerebral cortex on those tasks simply by minimizing a predictive cost function.
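To make the abstract's central claim concrete, the following is a minimal, illustrative sketch (in PyTorch) of the kind of setup the abstract describes: an LSTM receives the conditioned stimulus and reward as inputs, is trained only to minimize a predictive cost (mean-squared error on the next time step's reward), and a temporal-difference-style error is then read off the learned prediction as a dopamine-like signal. This is not the authors' implementation (see Rivest et al., J Comput Neurosci 28(1):107-130, 2010 for the actual model); the network size, trial timing, discount factor, and training details are all assumptions made for illustration.

```python
# Illustrative toy only, not the model of Rivest et al. (2010): an LSTM is
# trained to predict the next-step reward from the conditioned stimulus (CS),
# and a TD-style error is read off the learned prediction as a dopamine-like
# signal. All sizes, timings, and constants below are arbitrary assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

T = 25           # time steps per trial
CS_ON = 5        # CS onset
REWARD_T = 12    # reward time; CS stays on until reward (delay conditioning)
GAMMA = 0.9      # discount used only for the TD-style readout

def make_trial():
    cs = torch.zeros(T, 1)
    cs[CS_ON:REWARD_T] = 1.0   # delay conditioning: CS overlaps the reward interval
    r = torch.zeros(T, 1)
    r[REWARD_T] = 1.0
    return cs, r

class Predictor(nn.Module):
    """LSTM that predicts the reward expected at the next time step."""
    def __init__(self, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, cs, r):
        x = torch.cat([cs, r], dim=-1).unsqueeze(0)  # (1, T, 2)
        h, _ = self.lstm(x)
        return self.out(h).squeeze(0)                # (T, 1): predicted next reward

model = Predictor()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
cs, r = make_trial()

# Predictive cost: mean-squared error between the prediction at time t
# and the reward actually delivered at time t + 1.
for step in range(2000):
    pred = model(cs, r)
    loss = nn.functional.mse_loss(pred[:-1], r[1:])
    opt.zero_grad()
    loss.backward()
    opt.step()

# Dopamine-like TD-style error derived from the learned prediction v:
# delta(t) = r(t+1) + GAMMA * v(t+1) - v(t).
with torch.no_grad():
    v = model(cs, r).squeeze(-1)
    delta = r.squeeze(-1)[1:] + GAMMA * v[1:] - v[:-1]
    print(delta.numpy().round(2))
```

In this toy, once the prediction is learned the TD-style error at reward delivery shrinks toward zero while an error appears earlier, where the prediction ramps up; this transfer of the error away from the predicted reward is the qualitative behaviour of dopaminergic prediction-error signals that the TD account rests on. Changing make_trial so that the CS turns off several steps before the reward converts the toy to trace conditioning, in which the LSTM can only solve the prediction problem by holding the past CS in its memory cells, which is the delay/trace distinction the abstract's predictions build on.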
Pages: 23-48 (26 pages)