Construction of Deep ReLU Nets for Spatially Sparse Learning

Cited by: 2
Authors
Liu, Xia [1 ]
Wang, Di [2 ]
Lin, Shao-Bo [2 ]
Affiliations
[1] Xian Univ Technol, Sch Sci, Xian 710048, Peoples R China
[2] Xi An Jiao Tong Univ, Ctr Intelligent Decis Making & Machine Learning, Sch Management, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep learning; neural networks; training; spatial resolution; partitioning algorithms; signal resolution; optimization; constructive deep net (CDN); learning theory; spatial sparseness; approximation
DOI
10.1109/TNNLS.2022.3146062
CLC Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Training an interpretable deep net that realizes its theoretical advantages is difficult but extremely important in the machine learning community. In this article, motivated by the importance of spatial sparseness in signal and image processing, we develop a constructive approach that generates a deep net capturing the spatial sparseness feature. We conduct both theoretical analysis and numerical verification to show the power of the constructive approach. Theoretically, we prove that the constructive approach yields a deep net estimator that achieves the optimal generalization error bounds in the framework of learning theory. Numerically, we show that the constructive approach is essentially better than shallow learning in the sense that it provides better prediction accuracy with less training time.
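As a minimal illustration of the constructive (training-free) idea described in the abstract, the Python sketch below builds, rather than trains, a one-hidden-layer ReLU net that reproduces a spatially sparse univariate target by piecewise-linear interpolation on a grid. This is not the paper's CDN algorithm; the helper names (relu, hat, construct_relu_net) and the grid size n_knots are illustrative assumptions.

import numpy as np

def relu(x):
    # ReLU activation, applied elementwise.
    return np.maximum(x, 0.0)

def hat(x, center, width):
    # Triangular bump assembled from three ReLU neurons: it vanishes outside
    # (center - width, center + width) and peaks at 1 when x == center.
    return (relu(x - (center - width))
            - 2.0 * relu(x - center)
            + relu(x - (center + width))) / width

def construct_relu_net(target, n_knots=64):
    # Constructive step: hidden-layer weights and biases are fixed by a
    # uniform grid on [0, 1]; outer weights are point evaluations of the
    # target, so no gradient-based training is involved.
    knots = np.linspace(0.0, 1.0, n_knots)
    width = knots[1] - knots[0]
    coeffs = target(knots)  # spatially sparse target => mostly zero weights
    def net(x):
        return sum(c * hat(x, k, width) for c, k in zip(coeffs, knots))
    return net

# Spatially sparse target: supported only on the interval [0.4, 0.6].
f = lambda x: np.where(np.abs(x - 0.5) < 0.1,
                       np.cos(5.0 * np.pi * (x - 0.5)), 0.0)

net = construct_relu_net(f)
xs = np.linspace(0.0, 1.0, 1001)
print("max interpolation error:", np.max(np.abs(net(xs) - f(xs))))

Only the few grid cells that meet the target's support receive nonzero outer weights, which is the spatial-sparseness effect the abstract refers to; the CDN construction in the article itself is deeper and handles multivariate inputs.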
Pages: 7746 - 7760 (15 pages)
Related Papers (50 in total)
  • [1] Learning sparse and smooth functions by deep Sigmoid nets
    Liu, Xia
    APPLIED MATHEMATICS-A JOURNAL OF CHINESE UNIVERSITIES SERIES B, 2023, 38 (02) : 293 - 309
  • [2] Realization of Spatial Sparseness by Deep ReLU Nets With Massive Data
    Chui, Charles K.
    Lin, Shao-Bo
    Zhang, Bo
    Zhou, Ding-Xuan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (01) : 229 - 243
  • [3] Depth Selection for Deep ReLU Nets in Feature Extraction and Generalization
    Han, Zhi
    Yu, Siquan
    Lin, Shao-Bo
    Zhou, Ding-Xuan
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (04) : 1853 - 1868
  • [4] Improvement of Learning for CNN with ReLU Activation by Sparse Regularization
    Ide, Hidenori
    Kurita, Takio
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2684 - 2691
  • [5] Learning Sparse Feature Representations Using Probabilistic Quadtrees and Deep Belief Nets
    Basu, Saikat
    Karki, Manohar
    Ganguly, Sangram
    DiBiano, Robert
    Mukhopadhyay, Supratik
    Gayaka, Shreekant
    Kannan, Rajgopal
    Nemani, Ramakrishna
    NEURAL PROCESSING LETTERS, 2017, 45 (03) : 855 - 867
  • [6] New Error Bounds for Deep ReLU Networks Using Sparse Grids
    Montanelli, Hadrien
    Du, Qiang
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2019, 1 (01): : 78 - 92
  • [7] Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
    Hanin, Boris
    MATHEMATICS, 2019, 7 (10)