An approximate gradient descent algorithm for Spiking Neural Network

Cited by: 0
Authors
Chen, Wenjie [1 ]
Li, Chuandong [1 ]
Affiliations
[1] Southwest University, College of Electronic and Information Engineering, Chongqing 400715, People's Republic of China
Keywords
Spiking Neural Network (SNN); MNIST; Gradient descent algorithm; Approximate derivative; LIF
DOI
10.1109/CCDC58219.2023.10326825
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Spiking Neural Networks (SNNs) are the third generation of neural networks. They transmit information through spike trains, but discrete spike trains are non-differentiable, which makes it difficult to apply gradient descent algorithms to SNNs. In this paper, an approximate derivative of the spiking activity is introduced to model that activity, allowing an SNN to be trained with gradient descent. On this basis, the influence of different approximate derivatives on training accuracy is explored, and the iterative formula of the LIF (Leaky Integrate-and-Fire) neuron is simplified and optimized. The results show that, with the approximate derivative, the network has lower computational cost and better performance, and the variant using the moment-function approximate derivative achieves the highest accuracy. We take the MNIST data set as the input to the SNN, convert it into spike-train information with a frequency (rate) coding method based on spike counting, and propagate it through the simplified LIF neuron model. Following the error back-propagation rule, the synaptic weights and biases of the network are updated iteratively. The results show that the proposed algorithm achieves higher accuracy and faster training.
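The abstract combines three ingredients: rate (spike-count) coding of MNIST pixels, a simplified iterative LIF update, and an approximate derivative that stands in for the non-differentiable spike function during back-propagation. Below is a minimal PyTorch sketch of how these pieces can fit together; the rectangular surrogate window, the decay constant beta, the threshold v_th, and all shapes and constants are illustrative assumptions, not the authors' implementation.

    import torch

    class SurrogateSpike(torch.autograd.Function):
        """Heaviside spike in the forward pass; a smooth approximate
        derivative (a rectangular window around the threshold) backward."""
        @staticmethod
        def forward(ctx, v_minus_thresh):
            ctx.save_for_backward(v_minus_thresh)
            return (v_minus_thresh > 0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (v_minus_thresh,) = ctx.saved_tensors
            # Assumed rectangular surrogate: d(spike)/dv ~ 1 near threshold.
            return grad_output * (v_minus_thresh.abs() < 0.5).float()

    def lif_step(v, x, beta=0.9, v_th=1.0):
        """One step of a simplified iterative LIF neuron:
        leak, integrate the input current, spike, soft-reset."""
        v = beta * v + x
        s = SurrogateSpike.apply(v - v_th)
        return v - s * v_th, s

    def rate_encode(image, t_steps=25):
        """Spike-count (rate) coding: pixel intensity in [0, 1] is used
        as the per-step firing probability of a Bernoulli spike train."""
        return torch.bernoulli(image.unsqueeze(0).expand(t_steps, *image.shape))

    # Example: one forward/backward pass for a 784-pixel image, 10 classes.
    torch.manual_seed(0)
    w = torch.nn.Parameter(0.01 * torch.randn(784, 10))
    x_seq = rate_encode(torch.rand(784))      # stand-in for one MNIST image
    v, count = torch.zeros(10), torch.zeros(10)
    for x_t in x_seq:                         # unroll over the time steps
        v, s = lif_step(v, x_t @ w)
        count = count + s
    loss = torch.nn.functional.cross_entropy(count.unsqueeze(0), torch.tensor([3]))
    loss.backward()                           # gradients flow via the surrogate

Since the paper compares several approximate derivatives, replacing the rectangular window in backward() with another shape (for example, the derivative of a sigmoid) would be the natural way to reproduce that comparison.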
Pages: 4690-4694 (5 pages)