STSF: Spiking Time Sparse Feedback Learning for Spiking Neural Networks

Cited: 0
Authors
He, Ping [1 ,2 ]
Xiao, Rong [1 ,2 ]
Tang, Chenwei [1 ,2 ]
Huang, Shudong [1 ,2 ]
Lv, Jiancheng [1 ,2 ]
Tang, Huajin [3 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[2] Minist Educ, Engn Res Ctr Machine Learning & Ind Intelligence, Chengdu 610065, Peoples R China
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Global-local spiking learning; sparse direct feedback alignment (DFA); spiking neural networks (SNNs); vanilla spike-timing-dependent plasticity (STDP); OPTIMIZATION; PLASTICITY; NEURONS;
DOI
10.1109/TNNLS.2025.3527700
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are biologically plausible models known for their computational efficiency. A significant advantage of SNNs lies in the binary information transmission through spike trains, which eliminates the need for multiplication operations. However, due to the spatio-temporal nature of SNNs, directly applying traditional backpropagation (BP) training still incurs significant computational costs. Meanwhile, learning methods based on unsupervised synaptic plasticity provide an alternative for training SNNs but often yield suboptimal results. Thus, efficiently training high-accuracy SNNs remains a challenge. In this article, we propose a highly efficient and biologically plausible spiking time sparse feedback (STSF) learning method. The algorithm modifies synaptic weights by incorporating a neuromodulator for global supervised learning using sparse direct feedback alignment (DFA) and local homeostatic learning with vanilla spike-timing-dependent plasticity (STDP). Such neuromorphic global-local learning focuses on instantaneous synaptic activity, enabling each network layer to be optimized independently and simultaneously, which improves biological plausibility, enhances parallelism, and reduces storage overhead. Incorporating sparse fixed random feedback connections for global error modulation, which replaces multiplication operations with selection operations, further improves computational efficiency. Experimental results demonstrate that the proposed algorithm markedly reduces computational cost while achieving accuracy comparable to current state-of-the-art algorithms across a wide range of classification tasks. Our implementation code is available at https://github.com/hppeace/STSF.
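To make the abstract's two ingredients concrete, the sketch below combines a sparse DFA-style global error projection with a vanilla pair-based STDP update for a single toy layer in NumPy. This is only an illustration under assumed choices: the layer sizes, the sparsity level B_SPARSITY, the trace time constants, the learning rates, and the way the global and local terms are summed are all hypothetical and are not taken from the paper; the authors' actual implementation is at the GitHub link above.

```python
# Illustrative sketch only: sparse DFA-style global modulation plus
# vanilla pair-based STDP for one hidden layer. All constants and the
# combine rule are assumptions for illustration, not the STSF method.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 100, 64, 10
B_SPARSITY = 0.9                  # fraction of zero feedback entries (assumed)
LR_GLOBAL, LR_LOCAL = 1e-3, 1e-4  # learning rates (assumed)
TAU_PRE, TAU_POST = 20.0, 20.0    # STDP trace time constants, in time steps (assumed)

W = rng.normal(0, 0.1, (N_HID, N_IN))   # feedforward weights of the hidden layer

# Fixed random sparse feedback matrix with entries in {-1, 0, +1}; it is never learned.
B = rng.choice([-1.0, 0.0, 1.0],
               size=(N_HID, N_OUT),
               p=[(1 - B_SPARSITY) / 2, B_SPARSITY, (1 - B_SPARSITY) / 2])

pre_trace = np.zeros(N_IN)     # low-pass-filtered presynaptic spikes
post_trace = np.zeros(N_HID)   # low-pass-filtered postsynaptic spikes

def step(pre_spikes, post_spikes, error):
    """One plasticity step; `error` is the output-layer error vector."""
    global W, pre_trace, post_trace

    # Decay and accumulate the spike traces used by pair-based STDP.
    pre_trace[:] = pre_trace * np.exp(-1.0 / TAU_PRE) + pre_spikes
    post_trace[:] = post_trace * np.exp(-1.0 / TAU_POST) + post_spikes

    # Global term: project the error through the sparse +/-1 feedback.
    # Because entries are only -1, 0, or +1, this reduces to selecting
    # and adding/subtracting error components (no general multiply).
    fb = (B == 1.0).astype(float) @ error - (B == -1.0).astype(float) @ error
    dW_global = -LR_GLOBAL * np.outer(fb * post_spikes, pre_trace)

    # Local term: vanilla STDP (potentiate pre-before-post pairs,
    # depress post-before-pre pairs).
    dW_local = LR_LOCAL * (np.outer(post_spikes, pre_trace)
                           - np.outer(post_trace, pre_spikes))

    W += dW_global + dW_local

# Tiny usage example with random Poisson-like spikes and a random error vector.
for _ in range(5):
    step(pre_spikes=(rng.random(N_IN) < 0.1).astype(float),
         post_spikes=(rng.random(N_HID) < 0.1).astype(float),
         error=rng.normal(0, 0.1, N_OUT))
print("weight norm after 5 steps:", np.linalg.norm(W))
```

Because the update uses only the instantaneous spikes, the traces, and a fixed random feedback matrix, each layer's weights can in principle be updated in parallel without waiting for a backward pass, which is the efficiency argument the abstract makes.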
Pages: 14