Complexity of Deep Convolutional Neural Networks in Mobile Computing

Cited: 3
Authors
Naeem, Saad [1 ]
Jamil, Noreen [1 ]
Khan, Habib Ullah [2 ]
Nazir, Shah [3 ]
Affiliations
[1] Natl Univ Comp & Emerging Sci, Dept Comp Sci, Islamabad, Pakistan
[2] Qatar Univ, Coll Business & Econ, Dept Accounting & Informat Syst, Doha, Qatar
[3] Univ Swabi, Dept Comp Sci, Swabi, Pakistan
Keywords
Convolutional neural networks; Deep neural networks; Encoding (symbols); Signal encoding; Digital storage
DOI
10.1155/2020/3853780
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
Neural networks employ massive interconnections of simple computing units, called neurons, to solve problems that are highly nonlinear and cannot be hard-coded into a program. These networks are computation-intensive: training them requires large amounts of data, and each training example demands heavy computation. We look at different ways to reduce this computational burden and possibly make such networks work on mobile devices. In this paper, we survey various techniques that can be mixed and combined to improve the training time of neural networks, and we also review some additional recommendations for making the process work on mobile devices. Finally, we survey the deep compression technique, which tackles the problem through network pruning, quantization, and encoding of the network weights. Deep compression reduces the network's storage and computational requirements by first pruning the irrelevant connections (the pruning stage), then quantizing the network weights by choosing shared centroids for each layer, and, in the third stage, employing the Huffman coding algorithm to address the storage of the remaining weights.
Pages: 8
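The abstract walks through deep compression's three stages (pruning, centroid-based quantization, Huffman coding). Below is a minimal, self-contained NumPy sketch of those stages on a single toy weight matrix; the threshold, codebook size, and random weights are illustrative assumptions rather than values from the paper, whose actual pipeline operates layer by layer across a full network.

    import heapq
    from collections import Counter

    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.normal(size=(8, 8)).astype(np.float32)  # toy layer weights

    # Stage 1: pruning -- drop connections below a magnitude threshold.
    threshold = 0.5                      # assumed value, not from the paper
    mask = np.abs(weights) >= threshold
    survivors = weights[mask]

    # Stage 2: quantization -- cluster the surviving weights into k shared
    # centroids (a plain k-means loop; one codebook per layer).
    k = 4                                # assumed codebook size
    centroids = np.linspace(survivors.min(), survivors.max(), k)
    for _ in range(20):
        idx = np.abs(survivors[:, None] - centroids[None, :]).argmin(axis=1)
        for j in range(k):
            if np.any(idx == j):
                centroids[j] = survivors[idx == j].mean()

    # Stage 3: Huffman-code the centroid indices; frequent centroids get
    # shorter codes, cutting the bits needed to store the quantized weights.
    def huffman_code_lengths(symbols):
        """Code length per symbol, via the standard Huffman merge procedure."""
        heap = [(cnt, i, {s: 0})
                for i, (s, cnt) in enumerate(Counter(symbols).items())]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            cnt_a, _, code_a = heapq.heappop(heap)
            cnt_b, _, code_b = heapq.heappop(heap)
            merged = {s: depth + 1 for s, depth in {**code_a, **code_b}.items()}
            heapq.heappush(heap, (cnt_a + cnt_b, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    lengths = huffman_code_lengths(idx.tolist())
    total_bits = sum(lengths[s] for s in idx.tolist())
    print(f"kept {mask.sum()} of {weights.size} weights; "
          f"{k} centroids; {total_bits} bits for the index stream")

Linear initialization of the centroids over the surviving weight range is one common choice for this kind of quantization; other initializations would fit the sketch equally well.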