Sparsity-Aware Deep Learning Accelerator Design Supporting CNN and LSTM Operations

Cited by: 0
Authors
Hsiao, Shen-Fu [1]
Chang, Hsuan-Jui [1]
Affiliations
[1] Natl Sun Yat Sen Univ, Dept Comp Sci & Engn, Kaohsiung, Taiwan
Keywords
sparsity; convolutional neural networks; convolutional layers; fully-connected layers; long short-term memory (LSTM); processor
DOI
Not available
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
Sparsity of data and weights appears in many convolutional neural networks (CNNs) and in recurrent neural networks such as long short-term memory (LSTM) networks. In this paper, we design a sparsity-aware deep learning hardware accelerator that exploits both data and weight sparsity in CNN and LSTM models. The proposed accelerator significantly reduces memory accesses and computations, leading to much lower power consumption.
Pages: 4
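As a rough illustration of the idea summarized in the abstract, the sketch below shows how a sparsity-aware multiply-accumulate loop can skip zero weights and zero activations in a fully-connected (or flattened convolution) layer when weights are kept in a compressed, nonzero-only form. The function names and the compression scheme are illustrative assumptions, not the accelerator architecture proposed in the paper.

# Illustrative sketch only (assumed names and compression format, not the
# paper's design): skipping zero operands in a matrix-vector product.

def compress_row(weights_row):
    """Keep only nonzero weights together with their column positions."""
    return [(j, w) for j, w in enumerate(weights_row) if w != 0.0]

def sparse_fc(compressed_weights, activations):
    """Compute one fully-connected layer output, skipping zero operands."""
    outputs = []
    for row in compressed_weights:
        acc = 0.0
        for j, w in row:          # only nonzero weights are visited
            a = activations[j]
            if a == 0.0:          # zero activation: skip the multiply
                continue
            acc += w * a
        outputs.append(acc)
    return outputs

if __name__ == "__main__":
    # Toy 3x4 weight matrix with many zeros and a sparse activation vector.
    W = [
        [0.5, 0.0, -1.2, 0.0],
        [0.0, 0.0,  0.3, 0.7],
        [1.1, 0.0,  0.0, 0.0],
    ]
    x = [2.0, 0.0, 0.0, 4.0]
    Wc = [compress_row(row) for row in W]
    print(sparse_fc(Wc, x))   # only 3 of the 12 dense multiplies execute

In a hardware accelerator the same principle is applied by fetching only nonzero weights and activations from memory, which is what cuts both memory traffic and arithmetic work.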