Context-Dependent Encoding of Fear and Extinction Memories in a Large-Scale Network Model of the Basal Amygdala

Cited: 36
Authors
Vlachos, Ioannis [1 ]
Herry, Cyril [2 ,3 ]
Luethi, Andreas [4 ]
Aertsen, Ad [1 ,5 ]
Kumar, Arvind [1 ,5 ]
Affiliations
[1] Bernstein Ctr Computat Neurosci Freiburg, Freiburg, Germany
[2] Neuroctr Magendie, Bordeaux, France
[3] INSERM, U862, Bordeaux, France
[4] Friedrich Miescher Inst Biomed Res, Basel, Switzerland
[5] Univ Freiburg, Fac Biol, Dept Neurobiol & Biophys, Freiburg, Germany
Funding
Swiss National Science Foundation;
Keywords
LONG-TERM POTENTIATION; EXCITATORY SYNAPTIC-TRANSMISSION; SYNCHRONIZED GAMMA-OSCILLATIONS; RAT BASOLATERAL AMYGDALA; CONDITIONED FEAR; LATERAL AMYGDALA; VISUAL-CORTEX; INTERNEURON NETWORKS; HIPPOCAMPAL INACTIVATION; DOPAMINERGIC INNERVATION;
DOI
10.1371/journal.pcbi.1001104
Chinese Library Classification
Q5 [Biochemistry];
Discipline Codes
071010; 081704;
Abstract
The basal nucleus of the amygdala (BA) is involved in the formation of context-dependent conditioned fear and extinction memories. To understand the underlying neural mechanisms, we developed a large-scale neuronal network model of the BA, composed of excitatory and inhibitory leaky integrate-and-fire neurons. Excitatory BA neurons received conditioned stimulus (CS)-related input from the adjacent lateral nucleus (LA) and contextual input from the hippocampus or medial prefrontal cortex (mPFC). We implemented a plasticity mechanism according to which CS and contextual synapses were potentiated if CS and contextual inputs temporally coincided on the afferents of the excitatory neurons. Our simulations revealed a differential recruitment of two distinct subpopulations of BA neurons during conditioning and extinction, mimicking the activation of experimentally observed cell populations. We propose that these two subgroups encode the contextual specificity of fear and extinction memories, respectively. Mutual competition between them, mediated by feedback inhibition and driven by contextual inputs, regulates activity in the central amygdala (CEA), thereby controlling amygdala output and fear behavior. The model makes multiple testable predictions that may advance our understanding of fear and extinction memories.
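The coincidence-based plasticity rule summarized in the abstract can be illustrated with a minimal sketch: a single leaky integrate-and-fire neuron receives CS and contextual spike trains, and both afferent weights are potentiated whenever the two inputs arrive in the same time step. All parameter values, the function name, and the simple additive weight update are illustrative assumptions for this sketch, not values taken from the paper.

```python
def simulate_lif_with_coincidence_plasticity(
    cs_input, ctx_input, dt=1.0, tau_m=20.0, v_rest=-70.0,
    v_thresh=-55.0, v_reset=-70.0, w_cs=0.5, w_ctx=0.5,
    dw=0.05, w_max=2.0,
):
    """Leaky integrate-and-fire neuron driven by CS and contextual spike trains.

    Both synaptic weights are potentiated (up to a cap) whenever a CS spike
    and a contextual spike coincide in the same time step. All parameters
    are illustrative, not taken from the paper.
    """
    v = v_rest
    spike_times = []
    for t, (cs, ctx) in enumerate(zip(cs_input, ctx_input)):
        # Hebbian-style coincidence rule: potentiate both afferents together
        if cs and ctx:
            w_cs = min(w_cs + dw, w_max)
            w_ctx = min(w_ctx + dw, w_max)
        # Synaptic drive: membrane-potential jump (mV) scaled by the weights
        i_syn = 10.0 * (w_cs * cs + w_ctx * ctx)
        # Leaky integration toward rest, plus the synaptic input
        v += dt / tau_m * (v_rest - v) + i_syn
        if v >= v_thresh:
            spike_times.append(t)
            v = v_reset
    return spike_times, w_cs, w_ctx
```

With regularly repeating CS and contextual trains, the weights grow toward their cap as coincidences accumulate, and the neuron begins to fire once single inputs alone can bridge the gap between rest and threshold; in the full model, this potentiation is what lets context gate the recruitment of the fear and extinction subpopulations.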
Pages: 15
Related Papers
50 items total
  • [41] Encoding Spatial Context for Large-Scale Partial-Duplicate Web Image Retrieval
    Wen-Gang Zhou
    Hou-Qiang Li
    Yijuan Lu
    Qi Tian
    Journal of Computer Science and Technology, 2014, 29 : 837 - 848
  • [43] When gut feelings teach the brain to fear pain: Context-dependent activation of the central fear network in a novel interoceptive conditioning paradigm
    Icenhour, Adriane
    Petrakova, Liubov
    Hazzan, Nelly
    Theysohn, Nina
    Merz, Christian J.
    Elsenbruch, Sigrid
    NEUROIMAGE, 2021, 238
  • [44] Toward Large-Scale Face Recognition Using Social Network Context
    Stone, Zak
    Zickler, Todd
    Darrell, Trevor
    PROCEEDINGS OF THE IEEE, 2010, 98 (08) : 1408 - 1415
  • [45] Distributed intelligent network management model for the large-scale computer network
    Luo, Junzhou
    Li, Wei
    Liu, Bo
    COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN II, 2006, 3865 : 313 - 323
  • [46] Global context-dependent recurrent neural network language model with sparse feature learning
    Deng, Hongli
    Zhang, Lei
    Wang, Lituan
    NEURAL COMPUTING & APPLICATIONS, 2019, 31 (Suppl 2): 999 - 1011
  • [47] Hippocampal Contributions to the Large-Scale Episodic Memory Network Predict Vivid Visual Memories
    Geib, Benjamin R.
    Stanley, Matthew L.
    Wing, Erik A.
    Laurienti, Paul J.
    Cabeza, Roberto
    CEREBRAL CORTEX, 2017, 27 (01) : 680 - 693
  • [48] Decoding Position to Analyze Spatial Information Encoding in a Large-Scale Neuronal Network Model of Rat Dentate Gyrus
    Yu, Gene J.
    Bouteiller, Jean-Marie C.
    Song, Dong
    Berger, Theodore W.
    2018 40TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2018: 6137 - 6140
  • [50] Deep Context: A Neural Language Model for Large-scale Networked Documents
    Wu, Hao
    Lerman, Kristina
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 3091 - 3097