Task representations in neural networks trained to perform many cognitive tasks

Cited by: 253
Authors
Yang, Guangyu Robert [1 ,2 ]
Joglekar, Madhura R. [1 ,6 ]
Song, H. Francis [1 ,7 ]
Newsome, William T. [3 ,4 ]
Wang, Xiao-Jing [1 ,5 ]
Affiliations
[1] NYU, Ctr Neural Sci, New York, NY 10003 USA
[2] Columbia Univ, Mortimer B Zuckerman Mind Brain Behav Inst, Dept Neurosci, New York, NY USA
[3] Stanford Univ, Dept Neurobiol, Stanford, CA 94305 USA
[4] Stanford Univ, Howard Hughes Med Inst, Stanford, CA 94305 USA
[5] Shanghai Res Ctr Brain Sci & Brain Inspired Intel, Shanghai, Peoples R China
[6] NYU, Courant Inst Math Sci, New York, NY USA
[7] DeepMind, London, England
Funding
National Science Foundation (USA);
Keywords
PREFRONTAL CORTEX; WORKING-MEMORY; SELECTIVITY; MECHANISMS; DYNAMICS; NEURONS; MODELS; WINDOW;
DOI
10.1038/s41593-018-0310-2
Chinese Library Classification: Q189 [Neuroscience]
Discipline code: 071006
Abstract
The brain has the ability to flexibly perform many tasks, but the underlying mechanism cannot be elucidated in traditional experimental and modeling studies designed for one task at a time. Here, we trained single network models to perform 20 cognitive tasks that depend on working memory, decision making, categorization, and inhibitory control. We found that after training, recurrent units can develop into clusters that are functionally specialized for different cognitive processes, and we introduce a simple yet effective measure to quantify relationships between single-unit neural representations of tasks. Learning often gives rise to compositionality of task representations, a critical feature for cognitive flexibility, whereby one task can be performed by recombining instructions for other tasks. Finally, networks developed mixed task selectivity similar to recorded prefrontal neurons after learning multiple tasks sequentially with a continual-learning technique. This work provides a computational platform to investigate neural representations of many cognitive tasks.
Pages: 297-306
Number of pages: 14
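The "simple yet effective measure" mentioned in the abstract relates tasks through how strongly each unit's activity varies within each task. The sketch below illustrates that idea only: the simulated activity arrays, shapes, and correlation-based similarity are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical recorded activity, simulated for illustration:
# activity[task] has shape (n_conditions, n_timesteps, n_units).
n_units, n_tasks = 64, 5
activity = {t: rng.normal(size=(16, 20, n_units)) for t in range(n_tasks)}

def task_variance(act):
    """Variance of each unit across task conditions, averaged over time."""
    # act: (n_conditions, n_timesteps, n_units) -> (n_units,)
    return act.var(axis=0).mean(axis=0)

# Task-variance matrix: rows = units, columns = tasks.
tv = np.stack([task_variance(activity[t]) for t in range(n_tasks)], axis=1)

# Normalize each unit by its peak variance across tasks, so every
# unit's task profile lies in [0, 1] regardless of overall firing scale.
tv_norm = tv / tv.max(axis=1, keepdims=True)

# One possible task-similarity measure (an assumption here): correlation
# between the task-variance patterns two tasks evoke across all units.
task_similarity = np.corrcoef(tv_norm.T)  # shape (n_tasks, n_tasks)
```

Units whose normalized profiles concentrate on one subset of tasks would correspond to the functionally specialized clusters the abstract describes.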
Related Papers (50 records)
  • [1] Task representations in neural networks trained to perform many cognitive tasks
    Yang, Guangyu Robert
    Joglekar, Madhura R.
    Song, H. Francis
    Newsome, William T.
    Wang, Xiao-Jing
    Nature Neuroscience, 2019, 22: 297-306
  • [2] Abstract representations emerge naturally in neural networks trained to perform multiple tasks
    Johnston, W. Jeffrey
    Fusi, Stefano
    Nature Communications, 2023, 14 (1)
  • [3] Transforming task representations to perform novel tasks
    Lampinen, Andrew K.
    McClelland, James L.
    Proceedings of the National Academy of Sciences of the United States of America, 2020, 117 (52): 32970-32981
  • [4] Neural changes after training to perform cognitive tasks
    Qi, Xue-Lian
    Constantinidis, Christos
    Behavioural Brain Research, 2013, 241: 235-243
  • [5] Structured (De)composable Representations Trained with Neural Networks
    Spinks, Graham
    Moens, Marie-Francine
    Computers, 2020, 9 (4): 1-23
  • [6] Acoustooptic devices perform many tasks
    Tebo, A. R.
    Laser Focus/Electro-Optics, 1987, 23 (10): 108+
  • [7] Task Discovery: Finding the Tasks that Neural Networks Generalize on
    Atanov, Andrei
    Filatov, Andrei
    Yeo, Teresa
    Sohmshetty, Ajay
    Zamir, Amir
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [8] Factors affecting the opportunity to perform trained tasks on the job
    Ford, J. K.
    Quinones, M. A.
    Sego, D. J.
    Sorra, J. S.
    Personnel Psychology, 1992, 45 (3): 511-527
  • [9] Neural Networks Trained to Solve Differential Equations Learn General Representations
    Magill, Martin
    Qureshi, Faisal Z.
    de Haan, Hendrick W.
    Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31