Local and Global Sparsity for Deep Learning Networks

Times Cited: 0
Authors
Zhang, Long [1 ]
Zhao, Jieyu [1 ]
Shi, Xiangfu [1 ]
Ye, Xulun [1 ]
Affiliations
[1] Ningbo Univ, Dept Comp Sci, 818 Fenghua Rd, Ningbo 315211, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Sparsity; Regularization; Deep learning; GAN
DOI
10.1007/978-3-319-71589-6_7
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Classification Code
081202
Abstract
Applying sparsity regularization in deep learning networks has proven to be an effective approach. Researchers have developed several algorithms to control the sparseness of the activation probabilities of hidden units, but each of them has inherent limitations. In this paper, we first analyze the strengths and weaknesses of popular sparsity algorithms and categorize them into two groups: local and global sparsity. L-1/2 regularization is introduced for the first time as a global sparsity method for deep learning networks. Second, we propose a combined solution that integrates local and global sparsity methods. Third, we adapt the proposed solution to two deep learning networks, the deep belief network (DBN) and the generative adversarial network (GAN), and evaluate it on the benchmark datasets MNIST and CelebA. Experimental results show that our method outperforms existing sparsity algorithms on digit recognition and achieves better performance on human face generation. Additionally, the proposed method stabilizes the GAN loss and eliminates noise.
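The combined local-and-global scheme described in the abstract can be illustrated with a minimal PyTorch sketch. This is an illustrative assumption, not the authors' implementation: it assumes the global term is an L-1/2 penalty summed over all weights and the local term is a KL-divergence penalty driving each hidden unit's mean activation toward a target rate rho, a common choice for sparse DBNs. The names l_half_penalty, kl_sparsity_penalty, and combined_loss, as well as the lambda weights, are hypothetical placeholders.

import torch

def l_half_penalty(w: torch.Tensor) -> torch.Tensor:
    # Global sparsity: L1/2 regularization, the sum of |w|^(1/2) over all
    # weights. The small epsilon keeps the gradient finite at w = 0.
    return torch.sqrt(w.abs() + 1e-12).sum()

def kl_sparsity_penalty(hidden: torch.Tensor, rho: float = 0.05) -> torch.Tensor:
    # Local sparsity: KL divergence between a target activation probability
    # rho and each hidden unit's mean activation rho_hat (assumes activations
    # lie in (0, 1), e.g. sigmoid outputs of shape [batch, units]).
    rho_hat = hidden.mean(dim=0).clamp(1e-6, 1 - 1e-6)
    return (rho * torch.log(rho / rho_hat)
            + (1 - rho) * torch.log((1 - rho) / (1 - rho_hat))).sum()

def combined_loss(task_loss, model, hidden, lam_global=1e-4, lam_local=1e-3):
    # Weighted sum of the task loss, the global L1/2 weight penalty, and the
    # local KL activation penalty; the lambda values are placeholders to tune.
    global_reg = sum(l_half_penalty(p) for p in model.parameters())
    return task_loss + lam_global * global_reg + lam_local * kl_sparsity_penalty(hidden)

The epsilon inside the square root is a practical concession: the gradient of |w|^(1/2) diverges at w = 0, so a small offset keeps optimization stable while preserving the penalty's strong push toward exactly sparse weights.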
Pages: 74-85
Number of pages: 12