Learning to Time-Decode in Spiking Neural Networks Through the Information Bottleneck

Cited by: 0
Authors
Skatchkovsky, Nicolas [1 ]
Simeone, Osvaldo [1 ]
Jang, Hyeryung [2 ]
Affiliations
[1] Kings Coll London, KCLIP Lab, Dept Engn, London, England
[2] Dongguk Univ, ION Grp, Dept AI, Seoul, South Korea
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, Vol. 34
Funding
National Research Foundation of Singapore; European Research Council
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
One of the key challenges in training Spiking Neural Networks (SNNs) is that target outputs typically come in the form of natural signals, such as labels for classification or images for generative models, and need to be encoded into spikes. This is done by handcrafting target spiking signals, which in turn implicitly fixes the mechanism used to decode spikes into natural signals, e.g., rate decoding. The arbitrary choice of target signals and decoding rule generally impairs the capacity of the SNN to encode and process information in the timing of spikes. To address this problem, this work introduces a hybrid variational autoencoder architecture, consisting of an encoding SNN and a decoding Artificial Neural Network (ANN). The role of the decoding ANN is to learn how to best convert the spiking signals output by the SNN into the target natural signal. A novel end-to-end learning rule is introduced that optimizes a directed information bottleneck training criterion via surrogate gradients. We demonstrate the applicability of the technique in experiments on various tasks, including real-life datasets.
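The two building blocks the abstract relies on can be illustrated with a minimal NumPy sketch: a leaky integrate-and-fire (LIF) layer that encodes an analog signal into spikes, a sigmoid-based surrogate for the non-differentiable spike function (as commonly used in surrogate-gradient training), and a linear "decoding ANN" readout. This is purely illustrative and not the authors' implementation; the function names, parameter values, and the untrained linear decoder are all assumptions.

```python
import numpy as np

def lif_encode(x, threshold=1.0, leak=0.9, steps=20):
    """Encode an analog input vector into a binary spike train with a
    leaky integrate-and-fire neuron per dimension (illustrative sketch)."""
    v = np.zeros_like(x, dtype=float)
    spikes = []
    for _ in range(steps):
        v = leak * v + x                      # leaky integration of input current
        s = (v >= threshold).astype(float)    # Heaviside spike function
        v = v * (1.0 - s)                     # reset membrane potential on spike
        spikes.append(s)
    return np.stack(spikes)                   # shape: (steps, len(x))

def surrogate_grad(v, threshold=1.0, beta=5.0):
    """Sigmoid-derivative surrogate for the Heaviside step, used in place of
    its (zero almost everywhere) true derivative during backpropagation."""
    sig = 1.0 / (1.0 + np.exp(-beta * (v - threshold)))
    return beta * sig * (1.0 - sig)

# Encode a signal, then decode the full spike train with a linear readout
# standing in for the decoding ANN (weights untrained here).
x = np.array([0.2, 0.3, 0.4, 0.5])
spike_train = lif_encode(x)                   # (20, 4) binary array
rng = np.random.default_rng(0)
W = rng.normal(size=(spike_train.size, 4))    # hypothetical decoder weights
x_hat = spike_train.reshape(-1) @ W           # decoded "natural signal"
```

Note that the decoder reads the whole spike train rather than a spike count, so, once trained, it is free to exploit spike timing instead of being restricted to a fixed rate-decoding rule, which is the motivation the abstract gives for learning the decoder.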
Pages: 11
Related Papers
50 items in total
  • [41] Information coding and hardware architecture of spiking neural networks
    Abderrahmane, Nassim
    Miramond, Benoit
    2019 22ND EUROMICRO CONFERENCE ON DIGITAL SYSTEM DESIGN (DSD), 2019, : 291 - 298
  • [42] Evolving spiking neural networks for audiovisual information processing
    Wysoski, Simei Gomes
    Benuskova, Lubica
    Kasabov, Nikola
    NEURAL NETWORKS, 2010, 23 (07) : 819 - 835
  • [43] Spike Based Information Processing in Spiking Neural Networks
    Sheik, Sadique
    PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON APPLICATIONS IN NONLINEAR DYNAMICS (ICAND 2016), 2017, 6 : 177 - 188
  • [44] Exploring Temporal Information Dynamics in Spiking Neural Networks
    Kim, Youngeun
    Li, Yuhang
    Park, Hyoungseob
    Venkatesha, Yeshwanth
    Hambitzer, Anna
    Panda, Priyadarshini
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 7, 2023, : 8308 - 8316
  • [45] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang L.
    Du Z.
    Li L.
    Chen Y.
HIGH TECHNOLOGY LETTERS, 2020, 26 (02) : 136 - 144
  • [47] Fast Learning in Spiking Neural Networks by Learning Rate Adaptation
    Fang Huijuan
    Luo Jiliang
    Wang Fei
    CHINESE JOURNAL OF CHEMICAL ENGINEERING, 2012, 20 (06) : 1219 - 1224
  • [48] A compound memristive synapse model for statistical learning through STDP in spiking neural networks
    Bill, Johannes
    Legenstein, Robert
    FRONTIERS IN NEUROSCIENCE, 2014, 8
  • [49] Memory-Dependent Computation and Learning in Spiking Neural Networks Through Hebbian Plasticity
    Limbacher, Thomas
    Ozdenizci, Ozan
    Legenstein, Robert
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, : 1 - 12
  • [50] Learning delays through gradients and structure: emergence of spatiotemporal patterns in spiking neural networks
    Mészáros, Balázs
    Knight, James C.
    Nowotny, Thomas
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2024, 18