Sparsity Enables Data and Energy Efficient Spiking Convolutional Neural Networks

Cited: 5
Authors
Bhatt, Varun [1 ]
Ganguly, Udayan [1 ]
Affiliations
[1] Indian Inst Technol, Dept Elect Engn, Mumbai, Maharashtra, India
Keywords
Sparse coding; Unsupervised learning; Feature extraction; Spiking neural networks; Training data efficiency; Neurons
DOI
10.1007/978-3-030-01418-6_26
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, deep learning has surpassed human performance on image recognition tasks. A major issue with deep learning systems is their reliance on large datasets for optimal performance. When presented with a new task, the ability to generalize from small amounts of data becomes highly attractive. Research has shown that the human visual cortex might employ sparse coding to extract features from the images we see, leading to efficient use of the available data. To ensure good generalization and energy efficiency, we create a multi-layer spiking convolutional neural network that performs layer-wise sparse coding for unsupervised feature extraction. Applied to the MNIST dataset, it achieves 92.3% accuracy with just 500 data samples, 4x fewer than vanilla CNNs need for similar accuracy, while reaching 98.1% accuracy with the full dataset. Only around 7,000 spikes are used per image (a 6x reduction in transferred bits per forward pass compared to CNNs), implying high sparsity. Thus, we show that our algorithm ensures better sparsity, leading to improved data and energy efficiency in learning, which is essential for some real-world applications.
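The spike counts and sparsity figures in the abstract can be made concrete with a toy rate-coded leaky integrate-and-fire (LIF) layer. The sketch below is illustrative only and is not the authors' algorithm: the layer sizes, threshold, leak factor, and random weights are all assumptions; it merely shows how a spiking layer produces sparse binary spike trains and how spikes per input image would be counted.

```python
import numpy as np

def lif_layer(inputs, weights, threshold=1.0, leak=0.9, steps=20, rng=None):
    """Simulate a leaky integrate-and-fire (LIF) layer for `steps` time steps.

    `inputs` is rate-coded: each value in [0, 1] is treated as the
    probability of an input spike at each time step. Returns the binary
    spike train of the output neurons, shape (steps, n_out).
    """
    rng = rng or np.random.default_rng(0)
    n_out = weights.shape[1]
    v = np.zeros(n_out)                  # membrane potentials
    spikes = np.zeros((steps, n_out))
    for t in range(steps):
        in_spikes = (rng.random(inputs.shape) < inputs).astype(float)
        v = leak * v + in_spikes @ weights   # leaky integration of input current
        fired = v >= threshold
        spikes[t] = fired
        v[fired] = 0.0                   # reset neurons that fired
    return spikes

# Toy example: 784 rate-coded inputs (e.g. a 28x28 image) -> 32 neurons.
rng = np.random.default_rng(0)
x = rng.random(784) * 0.1                # weak, mostly-silent input rates
w = rng.normal(0.0, 0.05, size=(784, 32))
out = lif_layer(x, w, rng=rng)
print(f"total spikes: {int(out.sum())}, activity fraction: {out.mean():.3f}")
```

Counting `out.sum()` over all layers of a network gives the spikes-per-image metric quoted in the abstract; the lower that count, the fewer bits a neuromorphic implementation must transfer per forward pass.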
Pages: 263 - 272
Page count: 10
Related Papers
50 items total
  • [21] Energy efficient convolutional neural networks for arrhythmia detection
    Katsaouni, Nikoletta
    Aul, Florian
    Krischker, Lukas
    Schmalhofer, Sascha
    Hedrich, Lars
    Schulz, Marcel H.
    ARRAY, 2022, 13
  • [22] Selective Input Sparsity in Spiking Neural Networks for Pattern Classification
    Leigh, Alexander J.
    Heidarpur, Moslem
    Mirhassani, Mitra
    2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 799 - 803
  • [23] Efficient learning in spiking neural networks
    Rast, Alexander
    Aoun, Mario Antoine
    Elia, Eleni G.
    Crook, Nigel
    NEUROCOMPUTING, 2024, 597
  • [24] Dynamic Spike Bundling for Energy-Efficient Spiking Neural Networks
    Krithivasan, Sarada
    Sen, Sanchari
    Venkataramani, Swagath
    Raghunathan, Anand
    2019 IEEE/ACM INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN (ISLPED), 2019,
  • [25] On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks
    Louis, Thomas
    Pegatoquet, Alain
    Miramond, Benoit
    Girard, Adrien
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT X, 2024, 15025 : 407 - 421
  • [26] Towards Energy-Efficient Sentiment Classification with Spiking Neural Networks
    Chen, Junhao
    Ye, Xiaojun
    Sun, Jingbo
    Li, Chao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 518 - 529
  • [27] On Implicit Filter Level Sparsity in Convolutional Neural Networks
    Mehta, Dushyant
    Kim, Kwang In
    Theobalt, Christian
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 520 - 528
  • [28] Deep Convolutional Spiking Neural Networks for Keyword Spotting
    Yilmaz, Emre
    Gevrek, Ozgur Bora
    Wu, Jibin
    Chen, Yuxiang
    Meng, Xuanbo
    Li, Haizhou
    INTERSPEECH 2020, 2020, : 2557 - 2561
  • [29] Conversion of Siamese networks to spiking neural networks for energy-efficient object tracking
    Luo, Yihao
    Shen, Haibo
    Cao, Xiang
    Wang, Tianjiang
    Feng, Qi
    Tan, Zehan
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (12): 9967 - 9982