Efficient training of supervised spiking neural networks via the normalized perceptron based learning rule

Cited by: 22
Authors:
Xie, Xiurui [1 ]
Qu, Hong [1 ]
Liu, Guisong [1 ]
Zhang, Malu [1 ]
Affiliations:
[1] University of Electronic Science and Technology of China, School of Computer Science and Engineering, Chengdu 610054, People's Republic of China
Keywords:
Spiking neural networks; Temporal encoding mechanism; Supervised learning; Perceptron based learning rule; Functional architecture; Receptive fields; Classification; Oscillations; Algorithm; ReSuMe
DOI: 10.1016/j.neucom.2017.01.086
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) are the third generation of artificial neural networks and have achieved notable success in pattern recognition. However, existing supervised training methods for SNNs are, in most cases, not efficient enough to meet real-time requirements. To address this issue, this paper proposes the normalized perceptron based learning rule (NPBLR) for the supervised training of multi-layer SNNs. Unlike traditional methods, our algorithm trains only the selected misclassified time points and the target ones, employing a perceptron-based neuron. Furthermore, the weight modification is normalized by a voltage-based function, which is more efficient than the traditional time-based method because the firing time is itself computed from the voltage value. Unlike traditional multi-layer algorithms that ignore the temporal accumulation of spikes, our algorithm defines the spiking activity of the postsynaptic neuron as the rate accumulation function of all presynaptic neurons within a specific time-frame. With these strategies, our algorithm overcomes several difficulties in training SNNs, e.g., the inefficiency and no-fire problems. Comprehensive simulations on both single- and multi-layer networks demonstrate that our algorithm achieves higher learning efficiency and stronger parameter robustness than traditional algorithms. (C) 2017 Elsevier B.V. All rights reserved.
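As a rough illustration of the training scheme the abstract describes (perceptron-style weight updates applied only at misclassified and target time points, normalized by a voltage-based factor), the following minimal Python sketch may help. Everything here is an assumption made for illustration: the function name npblr_like_update, the PSP traces, and the exact normalization form are hypothetical and do not reproduce the paper's actual rule.

    import numpy as np

    def npblr_like_update(weights, psp, actual_spikes, target_spikes,
                          voltage, threshold=1.0, lr=0.01, eps=1e-8):
        """One pass over a spike train of length T (illustrative only).

        weights       : (N,) synaptic weights of the output neuron
        psp           : (T, N) presynaptic potential traces (PSP kernel values)
        actual_spikes : (T,) binary output spike train
        target_spikes : (T,) binary desired spike train
        voltage       : (T,) membrane potential of the output neuron
        """
        for t in range(len(target_spikes)):
            # +1 for a missing spike, -1 for a spurious spike, 0 if correct
            err = target_spikes[t] - actual_spikes[t]
            if err == 0:
                continue  # only misclassified / target time points are trained
            # Assumed voltage-based normalization: scale the update by how far
            # the membrane potential is from threshold, relative to the input.
            norm = abs(threshold - voltage[t]) / (np.linalg.norm(psp[t]) + eps)
            weights += lr * err * norm * psp[t]
        return weights

    # Toy usage with random data, just to show the call shape.
    rng = np.random.default_rng(0)
    T, N = 200, 30
    w = rng.normal(0.0, 0.1, N)
    psp = rng.random((T, N))
    target = (rng.random(T) < 0.05).astype(float)
    actual = (rng.random(T) < 0.05).astype(float)
    v = rng.random(T)
    w = npblr_like_update(w, psp, actual, target, v)

The point being illustrated: because the update skips correctly classified time points and scales directly with the membrane potential's distance from threshold (rather than with firing-time differences), most time steps cost nothing, which is consistent with the efficiency claim in the abstract.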
Pages: 152-163
Page count: 12
Related Papers
50 records in total
  • [1] Supervised learning in spiking neural networks with FORCE training
    Nicola, Wilten
    Clopath, Claudia
    NATURE COMMUNICATIONS, 2017, 8
  • [2] The maximum points-based supervised learning rule for spiking neural networks
    Xie, Xiurui
    Liu, Guisong
    Cai, Qing
    Qu, Hong
    Zhang, Malu
    SOFT COMPUTING, 2019, 23 (20): 10187-10198
  • [3] An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks
    Xie, Xiurui
    Qu, Hong
    Liu, Guisong
    Zhang, Malu
    Kurths, Juergen
    PLOS ONE, 2016, 11 (4)
  • [4] A Supervised Learning Rule for Recurrent Spiking Neural Networks with Weighted Spikes
    Shi, Guoyong
    Liang, Jungang
    Cui, Yong
    2022 IEEE 34TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI), 2022: 522-527
  • [5] NormAD - Normalized Approximate Descent based Supervised Learning Rule for Spiking Neurons
    Anwani, Navin
    Rajendran, Bipin
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [6] Efficient and Robust Supervised Learning Algorithm for Spiking Neural Networks
    Zhang, Y.
    Geng, T.
    Zhang, M.
    Wu, X.
    Zhou, J.
    Qu, H.
    SENSING AND IMAGING, 2018, 19 (1)
  • [7] Supervised learning with spiking neural networks
    Xin, J. G.
    Embrechts, M. J.
    IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001: 1772-1777
  • [8] Supervised Learning Based on Temporal Coding in Spiking Neural Networks
    Mostafa, Hesham
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (7): 3227-3235