Latent attractors: A general paradigm for context-dependent neural computation

Cited by: 0
Authors
Doboli, Simona [1 ]
Minai, Ali A. [2 ]
Affiliations
[1] Hofstra Univ, Dept Comp Sci, Hempstead, NY 11549 USA
[2] Univ Cincinnati, Dept Elect & Comp Engn & Comp Sci, Cincinnati, OH 45221 USA
Source
TRENDS IN NEURAL COMPUTATION | 2007 / Vol. 35
Funding
U.S. National Science Foundation
Keywords
attractor networks; recurrent networks; context; sequence learning; modular networks; multi-scale dynamics;
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Context is an essential part of all cognitive function. However, neural network models have considered this issue only in limited ways, focusing primarily on the conditioning of a system's response by its recent history. This type of context, which we term Type I, is clearly relevant in many situations, but in other cases the system's response over an extended period must be conditioned by stimuli encountered at a specific earlier time. For example, the decision to turn left or right at an intersection in a navigation task depends on the goal set at the beginning of the task. We term this type of context, which sets the "frame of reference" for an entire episode, Type II context. The prefrontal cortex in mammals has been hypothesized to perform this function, but it has been difficult to incorporate this into neural network models. In the present chapter, we describe an approach called latent attractors that allows self-organizing neural systems to incorporate both Type I and Type II context dependency simultaneously. We demonstrate this by applying the approach to a series of problems requiring one or both types of context. We also argue that the latent attractor approach is a general and flexible method for incorporating multi-scale temporal dependence into neural systems, and possibly into other self-organized network models.
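The chapter itself gives the architecture in full; as a rough, minimal sketch of the Type II mechanism described in the abstract (not the authors' model; all layer sizes, weights, and the threshold update rule here are assumptions invented for illustration), the following shows a latent layer whose attractor, set once by a cue at the start of an episode, persists through recurrent dynamics and biases how an output layer responds to an identical later stimulus.

```python
import numpy as np

N_LATENT = 40   # latent-layer units, split into two attractor groups
N_OUT = 8       # output units driven by the current stimulus
GROUPS = 2      # two latent attractors = two Type II contexts

# Latent recurrent weights: excitation within a group, inhibition
# across groups, so whichever group is cued sustains its own activity.
group = np.repeat(np.arange(GROUPS), N_LATENT // GROUPS)
W_rec = np.where(group[:, None] == group[None, :], 0.12, -0.15)
np.fill_diagonal(W_rec, 0.0)

# Latent-to-output weights: each latent group biases a different half
# of the output layer (an arbitrary choice made for this demo).
W_ctx = np.zeros((N_OUT, N_LATENT))
W_ctx[: N_OUT // 2, group == 0] = 0.5
W_ctx[N_OUT // 2:, group == 1] = 0.5

def run_episode(context_cue, stimulus, steps=30):
    """Cue a latent attractor at t = 0, then present a later stimulus.

    context_cue -- index of the latent group activated at episode start
    stimulus    -- feedforward drive, identical for all output units
    """
    x = np.zeros(N_LATENT)
    x[group == context_cue] = 1.0                 # transient cue, t = 0 only
    for _ in range(steps):
        x = (W_rec @ x + x > 0.5).astype(float)   # thresholded recurrence
    # The stimulus is the same in every episode; the response differs
    # only because the persisting latent state gates the output layer.
    y = stimulus + W_ctx @ x
    return (y > 1.0).astype(int)

stim = np.full(N_OUT, 0.8)                  # identical input in both episodes
print("context 0:", run_episode(0, stim))   # -> first half of outputs fire
print("context 1:", run_episode(1, stim))   # -> second half of outputs fire
```

Running the sketch shows the same stimulus producing different responses in the two episodes, which is the defining property of Type II context: the conditioning event occurs once, at the beginning, yet shapes the response long after the cue is gone.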
Pages: 135+
Page count: 9