A brain-inspired algorithm that mitigates catastrophic forgetting of artificial and spiking neural networks with low computational cost

Cited: 23
|
Authors
Zhang, Tielin [1 ,2 ,3 ]
Cheng, Xiang [1 ,2 ]
Jia, Shuncheng [1 ,2 ]
Li, Chengyu T. [3 ,4 ]
Poo, Mu-ming [3 ,4 ]
Xu, Bo [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
[3] Shanghai Ctr Brain Sci & Brain inspired Technol, Lingang Lab, Shanghai 200031, Peoples R China
[4] Chinese Acad Sci, Inst Neurosci, Ctr Excellence Brain Sci & Intelligence Technol, Shanghai 200031, Peoples R China
Keywords
TIMING-DEPENDENT PLASTICITY; DOPAMINE; STDP; SENSITIVITY; MODULATION; RECEPTORS;
DOI
10.1126/sciadv.adi2947
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07 ; 0710 ; 09 ;
Abstract
Neuromodulators in the brain act globally on many forms of synaptic plasticity, a phenomenon known as metaplasticity, which is rarely considered in existing spiking neural networks (SNNs) and nonspiking artificial neural networks (ANNs). Here, we report an efficient brain-inspired computing algorithm for SNNs and ANNs, referred to here as neuromodulation-assisted credit assignment (NACA), which uses expectation signals to deliver defined levels of neuromodulators to selected synapses, whereby long-term synaptic potentiation and depression are modified nonlinearly depending on the neuromodulator level. The NACA algorithm achieved high recognition accuracy with substantially reduced computational cost in learning spatial and temporal classification tasks. Notably, NACA also proved efficient in five class-continual learning tasks of varying complexity, markedly mitigating catastrophic forgetting at low computational cost. Mapping of synaptic weight changes showed that these benefits can be attributed to the sparse and targeted synaptic modifications produced by expectation-based global neuromodulation.
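The abstract's core mechanism, a global expectation signal nonlinearly gating which synapses undergo potentiation or depression, can be illustrated with a minimal toy sketch. This is an assumption-laden illustration, not the paper's actual NACA rule: the function name `naca_like_update`, the `tanh` gating, and the input-activity mask are all illustrative choices.

```python
import numpy as np

def naca_like_update(weights, pre, post, expectation, rate=0.01):
    """Toy three-factor plasticity update (illustrative, not the paper's rule).

    A global neuromodulator level, derived nonlinearly from the
    expectation (error) signal, scales a Hebbian coincidence term;
    an activity mask keeps modifications sparse and targeted.
    """
    # Hebbian coincidence term: outer product of post- and presynaptic activity
    hebb = np.outer(post, pre)
    # Global neuromodulator level: nonlinear (tanh) function of the expectation signal
    modulator = np.tanh(expectation)
    # Sparse, targeted modification: only synapses with active inputs are changed
    mask = (pre > 0).astype(float)
    return weights + rate * modulator * hebb * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(3, 4))                # 3 postsynaptic x 4 presynaptic weights
pre = np.array([1.0, 0.0, 1.0, 0.0])      # presynaptic activity (inputs 1 and 3 silent)
post = np.array([0.5, 1.0, 0.0])          # postsynaptic activity
w_new = naca_like_update(w, pre, post, expectation=2.0)
# Columns with silent inputs (and rows with silent outputs) remain untouched
```

The mask is what makes the update sparse: only co-active pre/post pairs are modified, which mirrors the abstract's explanation for the reduced computational cost and mitigated forgetting.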
Pages: 12
Related Papers
50 records in total
  • [1] Deep Learning in Spiking Neural Networks for Brain-Inspired Artificial Intelligence
    Kasabov, Nikola
    COMPUTER SYSTEMS AND TECHNOLOGIES (COMPSYSTECH'18), 2018, 1641 : 1 - 1
  • [2] Brain-Inspired Architecture for Spiking Neural Networks
    Tang, Fengzhen
    Zhang, Junhuai
    Zhang, Chi
    Liu, Lianqing
    BIOMIMETICS, 2024, 9 (10)
  • [3] Brain-inspired neural circuit evolution for spiking neural networks
    Shen, Guobin
    Zhao, Dongcheng
    Dong, Yiting
    Zeng, Yi
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2023, 120 (39)
  • [4] Brain-inspired Evolutionary Architectures for Spiking Neural Networks
    Pan W.
    Zhao F.
    Zhao Z.
    Zeng Y.
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11): 1 - 10
  • [5] Brain-inspired Balanced Tuning for Spiking Neural Networks
    Zhang, Tielin
    Zeng, Yi
    Zhao, Dongcheng
    Xu, Bo
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 1653 - 1659
  • [6] Brain-inspired wiring economics for artificial neural networks
    Zhang, Xin-Jie
    Moore, Jack Murdoch
    Gao, Ting-Ting
    Zhang, Xiaozhu
    Yan, Gang
    PNAS NEXUS, 2025, 4 (01)
  • [7] A Brain-Inspired Causal Reasoning Model Based on Spiking Neural Networks
    Fang, Hongjian
    Zeng, Yi
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [8] Brain-inspired replay for continual learning with artificial neural networks
    van de Ven, Gido M.
    Siegelmann, Hava T.
    Tolias, Andreas S.
    NATURE COMMUNICATIONS, 2020, 11 (01)
  • [9] A brain-inspired algorithm for training highly sparse neural networks
    Atashgahi, Zahra
    Pieterse, Joost
    Liu, Shiwei
    Mocanu, Decebal Constantin
    Veldhuis, Raymond
    Pechenizkiy, Mykola
    MACHINE LEARNING, 2022, 111 : 4411 - 4452