An approximate gradient descent algorithm for Spiking Neural Network

Cited: 0
Authors
Chen, Wenjie [1 ]
Li, Chuandong [1 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing 400715, Peoples R China
Keywords
Spiking Neural Network (SNN); MNIST; Gradient descent algorithm; Approximate derivative; LIF;
DOI
10.1109/CCDC58219.2023.10326825
CLC Classification Code
TP [Automation technology, computer technology];
Discipline Classification Code
0812 ;
Abstract
The Spiking Neural Network (SNN) is the third generation of neural networks. It transmits information through spike trains, but discrete spike trains are non-differentiable, which makes it difficult to apply the gradient descent algorithm to spiking neural networks. In this paper, an approximate derivative of the spiking activity is introduced to simulate that activity, so that a spiking neural network trained by gradient descent can be realized. On this basis, the influence of different approximate derivatives on the training accuracy of the network is explored, and the iterative formula of the LIF (Leaky Integrate-and-Fire) neuron is optimized and simplified. The results show that, with the approximate derivative introduced, our network has lower computational cost and better performance, and the moment-function model of the network achieves higher accuracy. We take the MNIST data set as the input of the spiking neural network, convert it into spike-train information by a frequency-coding method based on spike counting, and transmit it through the simplified LIF neuron model. Following the error back-propagation rule, the synaptic weights and error biases of the network are updated iteratively. The results show that the proposed algorithm achieves higher accuracy and faster training.
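The pipeline the abstract describes (rate coding by spike counting, a simplified iterative LIF update, and an approximate derivative in place of the non-differentiable spike function) can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the function names (`rate_encode`, `lif_step`, `surrogate_grad`), the rectangular surrogate window, and all parameter values are assumptions for the sketch.

```python
import numpy as np

def rate_encode(image, T=20, rng=None):
    """Frequency coding by spike counting: pixel intensity in [0, 1]
    sets the firing probability at each of T time steps (an assumed
    Bernoulli scheme, not necessarily the paper's exact encoder)."""
    rng = rng or np.random.default_rng(0)
    return (rng.random((T,) + image.shape) < image).astype(float)

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One step of a simplified iterative LIF neuron:
    leak the membrane potential, integrate the input,
    emit a spike at threshold, then hard-reset."""
    v = v / tau + x                      # leak + integrate
    s = (v >= v_th).astype(float)        # spike if threshold reached
    v = v * (1.0 - s)                    # reset fired neurons to 0
    return v, s

def surrogate_grad(v, v_th=1.0, width=0.5):
    """Rectangular approximate derivative of the spike function:
    ds/dv is taken as 1/(2*width) inside a window around the
    threshold and 0 elsewhere, making backpropagation possible."""
    return (np.abs(v - v_th) < width) / (2.0 * width)

# Forward pass over time, accumulating a spike count per neuron.
img = np.array([0.1, 0.5, 0.9, 1.0])     # toy "pixels" in [0, 1]
spikes_in = rate_encode(img, T=20)
v = np.zeros_like(img)
count = np.zeros_like(img)
for t in range(spikes_in.shape[0]):
    v, s = lif_step(v, spikes_in[t])
    count += s                            # spike-count readout
```

Different choices of `surrogate_grad` (rectangular, triangular, sigmoid-shaped) correspond to the "different approximate derivatives" whose effect on training accuracy the paper compares.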
Pages: 4690 - 4694
Page count: 5
Related papers
50 records in total
  • [31] Convergence of Gradient Descent Algorithm for Diagonal Recurrent Neural Networks
    Xu, Dongpo
    Li, Zhengxue
    Wu, Wei
    Ding, Xiaoshuai
    Qu, Di
    2007 SECOND INTERNATIONAL CONFERENCE ON BIO-INSPIRED COMPUTING: THEORIES AND APPLICATIONS, 2007, : 29 - 31
  • [33] A progressive surrogate gradient learning for memristive spiking neural network
    Wang, Shu
    Chen, Tao
    Gong, Yu
    Sun, Fan
    Shen, Si-Yuan
    Duan, Shu-Kai
    Wang, Li-Dan
    CHINESE PHYSICS B, 2023, 32 (06)
  • [34] Fast gradient descent algorithm for image classification with neural networks
    Abdelkrim El Mouatasim
    Signal, Image and Video Processing, 2020, 14 : 1565 - 1572
  • [35] IMAGE RECOGNITION ALGORITHM BASED ON SPIKING NEURAL NETWORK
    Xiao Fei
    Li Jianping
    Tian Jie
    Wang Guangshuo
    2022 19TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2022,
  • [36] STADIA: Photonic Stochastic Gradient Descent for Neural Network Accelerators
    Xia, Chengpeng
    Chen, Yawen
    Zhang, Haibo
    Wu, Jigang
    ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)
  • [37] A Fractional Gradient Descent-Based RBF Neural Network
    Khan, Shujaat
    Naseem, Imran
    Malik, Muhammad Ammar
    Togneri, Roberto
    Bennamoun, Mohammed
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2018, 37 (12) : 5311 - 5332
  • [39] Classification of Honey as Genuine or Fake via Artificial Neural Network using Gradient Descent Backpropagation Algorithm
    Hortinela, Carlos C.
    Balbin, Jessie R.
    Tibayan, Patrick Jonas A.
    Cabela, John Myrrh D.
    Magwili, Glenn V.
    2020 IEEE 12TH INTERNATIONAL CONFERENCE ON HUMANOID, NANOTECHNOLOGY, INFORMATION TECHNOLOGY, COMMUNICATION AND CONTROL, ENVIRONMENT, AND MANAGEMENT (HNICEM), 2020,
  • [40] Training qubit neural network with hybrid genetic algorithm and gradient descent for indirect adaptive controller design
    Ganjefar, Soheil
    Tofighi, Morteza
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2017, 65 : 346 - 360