STATISTICAL DYNAMICS OF LEARNING PROCESSES IN SPIKING NEURAL NETWORKS

Cited: 0
Authors
Hyland, David C. [1 ]
Affiliation
[1] Texas A&M Univ, College Stn, TX 77843 USA
Source
Keywords
DOI
Not available
CLC Number
V [Aeronautics, Astronautics];
Subject Classification Code
08; 0825;
Abstract
In previous work, the author and Dr. Jer-Nan Juang contributed a new neural net architecture within the framework of "second-generation" neural models. We showed how to implement backpropagation learning in a massively parallel architecture involving only local computations, thereby capturing one of the principal advantages of biological neural nets. Since then, a large body of neurobiological research has given rise to the "third-generation" models, namely spiking neural nets, in which the brief, sharp pulses (spikes) produced by neurons are explicitly modeled. Information is encoded not in average firing rates but in the temporal pattern of the spikes. Further, no physiological basis for backpropagation has been found; rather, synaptic plasticity is driven by the timing of spikes. The present paper examines the statistical dynamics of learning processes in spiking neural nets. Equations describing the evolution of synaptic efficacies and the probability distributions of the neural states are derived. Although the system is strongly nonlinear, the typically large number of synapses per neuron (~10,000) permits us to obtain a closed system of equations. As in the earlier work, we see that the learning process in this more realistic setting is dominated by local interactions, thereby preserving massive parallelism. It is hoped that the formulation given here will provide the basis for the rigorous analysis of learning dynamics in very large neural nets (10^10 neurons in the human brain!) for which direct simulation is difficult or impractical.
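
As a concrete illustration of the kind of local, spike-timing-driven plasticity the abstract refers to, the sketch below implements a standard pair-based spike-timing-dependent plasticity (STDP) update in Python. The exponential update rule, the parameter values, and the spike times are illustrative assumptions for exposition only; they are not the equations derived in the paper.

    # Minimal sketch of a pair-based STDP update (illustrative assumption,
    # not the paper's derived equations; all parameter values are arbitrary).
    import numpy as np

    A_PLUS, A_MINUS = 0.01, 0.012     # potentiation / depression amplitudes (assumed)
    TAU_PLUS, TAU_MINUS = 20.0, 20.0  # time constants in ms (assumed)

    def stdp_dw(t_pre: float, t_post: float) -> float:
        """Weight change for one pre/post spike pair, given spike times in ms."""
        dt = t_post - t_pre
        if dt > 0:      # pre before post: potentiate
            return A_PLUS * np.exp(-dt / TAU_PLUS)
        elif dt < 0:    # post before pre: depress
            return -A_MINUS * np.exp(dt / TAU_MINUS)
        return 0.0

    # Example: accumulate updates over all pre/post spike-time pairs for one synapse.
    pre_spikes = [10.0, 45.0, 80.0]
    post_spikes = [12.0, 44.0]
    w = 0.5
    for tp in pre_spikes:
        for tq in post_spikes:
            w += stdp_dw(tp, tq)
    w = float(np.clip(w, 0.0, 1.0))  # keep the synaptic efficacy bounded
    print(f"updated synaptic efficacy: {w:.4f}")

Note that the update for each synapse depends only on the spike times of its own pre- and postsynaptic neurons, which is the locality property the abstract emphasizes.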
Pages: 363-378
Number of pages: 16
Related Papers
50 records in total
  • [31] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang L.
    Du Z.
    Li L.
    Chen Y.
    High Technology Letters, 2020, 26 (02): 136 - 144
  • [33] Fast Learning in Spiking Neural Networks by Learning Rate Adaptation
    Fang Huijuan
    Luo Jiliang
    Wang Fei
    CHINESE JOURNAL OF CHEMICAL ENGINEERING, 2012, 20 (06) : 1219 - 1224
  • [34] Macroscopic dynamics of neural networks with heterogeneous spiking thresholds
    Gast, Richard
    Solla, Sara A.
    Kennedy, Ann
    PHYSICAL REVIEW E, 2023, 107 (02)
  • [35] Effects of Spike Anticipation on the Spiking Dynamics of Neural Networks
    de Santos-Sierra, Daniel
    Sanchez-Jimenez, Abel
    Garcia-Vellisca, Mariano A.
    Navas, Adrian
    Villacorta-Atienza, Jose A.
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2015, 9
  • [36] Exploring Temporal Information Dynamics in Spiking Neural Networks
    Kim, Youngeun
    Li, Yuhang
    Park, Hyoungseob
    Venkatesha, Yeshwanth
    Hambitzer, Anna
    Panda, Priyadarshini
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 7, 2023, : 8308 - 8316
  • [37] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024,
  • [38] Learning to Classify Faster Using Spiking Neural Networks
    Machingal, Pranav
    Thousif
    Dora, Shirin
    Sundaram, Suresh
    Meng, Qinggang
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [39] Analysis of the ReSuMe learning process for spiking neural networks
    Ponulak, Filip
    INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE, 2008, 18 (02) : 117 - 127
  • [40] Supervised learning in spiking neural networks with FORCE training
    Wilten Nicola
    Claudia Clopath
    Nature Communications, 8