Learning long sequences in spiking neural networks

Cited by: 0
Authors
Stan, Matei-Ioan [1 ]
Rhodes, Oliver [1 ]
Affiliations
[1] Univ Manchester, Dept Comp Sci, Manchester, England
Source
SCIENTIFIC REPORTS | 2024, Vol. 14, Issue 01
Funding
UK Research and Innovation (UKRI);
Keywords
Spiking neural networks; State space models; Sequence modelling; Long range dependencies; SYSTEMS;
DOI
10.1038/s41598-024-71678-8
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07 ; 0710 ; 09 ;
Abstract
Spiking neural networks (SNNs) take inspiration from the brain to enable energy-efficient computations. Since the advent of Transformers, SNNs have struggled to compete with artificial networks on modern sequential tasks, as they inherit limitations from recurrent neural networks (RNNs), with the added challenge of training with non-differentiable binary spiking activations. However, a recent renewed interest in efficient alternatives to Transformers has given rise to state-of-the-art recurrent architectures named state space models (SSMs). This work systematically investigates, for the first time, the intersection of state-of-the-art SSMs with SNNs for long-range sequence modelling. Results suggest that SSM-based SNNs can outperform the Transformer on all tasks of a well-established long-range sequence modelling benchmark. It is also shown that SSM-based SNNs can outperform current state-of-the-art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as large language models, to neuromorphic hardware for energy-efficient long-range sequence modelling.
Pages: 19
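
For readers unfamiliar with the combination described in the abstract, a minimal sketch of the general idea is given below, assuming a discretized linear state space recurrence followed by a binary (Heaviside) spiking readout. This illustrates the computation pattern only and is not the architecture, parameterisation, or training method of the paper; the function name ssm_spiking_layer, the matrices A, B, C, and the threshold theta are hypothetical placeholders.

    # Illustrative sketch only: a linear state space model (SSM) recurrence
    # followed by a binary spiking nonlinearity. All names and parameter
    # values here are hypothetical and chosen for demonstration.
    import numpy as np

    def ssm_spiking_layer(u, A, B, C, theta=1.0):
        """Run a linear SSM recurrence over a 1-D input sequence `u`,
        then emit binary spikes where the readout crosses `theta`."""
        x = np.zeros(A.shape[0])            # hidden SSM state
        spikes = np.zeros(len(u))
        for k, u_k in enumerate(u):
            x = A @ x + B * u_k             # linear state update (recurrent part)
            y = C @ x                       # linear readout
            spikes[k] = float(y >= theta)   # Heaviside step -> binary spike
        return spikes

    # Example usage with random, roughly stable parameters (hypothetical values).
    rng = np.random.default_rng(0)
    A = 0.9 * np.eye(4) + 0.01 * rng.standard_normal((4, 4))
    B = rng.standard_normal(4)
    C = rng.standard_normal(4)
    u = np.sin(np.linspace(0, 8 * np.pi, 256))   # toy input sequence
    s = ssm_spiking_layer(u, A, B, C, theta=0.5)
    print("spike rate:", s.mean())

In practice, training such a layer requires handling the non-differentiable spike function noted in the abstract, typically via a surrogate gradient during backpropagation.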