Learning broad learning system with controllable sparsity through L0 regularization

Citations: 7
Authors
Chu, Fei [1 ,2 ,3 ]
Wang, Guanghui [2 ]
Wang, Jun [2 ]
Chen, C. L. Philip [4 ]
Wang, Xuesong [2 ]
Affiliations
[1] China Univ Min & Technol, Artificial Intelligence Res Inst, Xuzhou 221116, Peoples R China
[2] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221116, Peoples R China
[3] Beijing Gen Res Inst Min & Met, State Key Lab Automat Control Technol Min & Met P, Beijing 100160, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Broad Learning System (BLS); Network compression; Sparse representation; Controllable; Normalized iterative hard thresholding (NIHT); NEURAL-NETWORKS; APPROXIMATION;
DOI
10.1016/j.asoc.2023.110068
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a novel neural network with efficient learning capacity, the broad learning system (BLS) has achieved remarkable success in various regression and classification problems. Owing to its broad expansion of nodes, however, BLS is known to contain many redundant parameters and nodes, which increase memory and computation costs and hinder its deployment on equipment with limited resources. To optimize the number of neurons and parameters of BLS and thus find the optimal sparse model under a given resource budget, in this paper we propose to train BLS through L0 regularization. The regularization term of the BLS objective function is replaced with an L0 constraint, and the normalized iterative hard thresholding (NIHT) method is used to optimize the output weights. More concretely, the model size is fixed by controlling the number of nonzero output weights under the given resource budget, and parameters and nodes are then evaluated and selected from the node set during training, yielding a BLS with controllable sparsity (CSBLS). Experiments on various data sets demonstrate the effectiveness of the proposed method. (C) 2023 Published by Elsevier B.V.
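Note: the abstract describes fitting the BLS output weights under a hard sparsity budget via normalized iterative hard thresholding (NIHT). The sketch below illustrates that optimization step in Python, assuming A is the matrix of stacked feature- and enhancement-node activations and y a single target column; the function names, the default iteration settings, and the simplified step-size rule (omitting the support-change safeguard of full NIHT) are illustrative assumptions, not code from the paper.

    import numpy as np

    def hard_threshold(w, k):
        # Keep the k largest-magnitude entries of w; zero out the rest.
        out = np.zeros_like(w)
        idx = np.argpartition(np.abs(w), -k)[-k:]
        out[idx] = w[idx]
        return out

    def niht(A, y, k, n_iter=200, tol=1e-6):
        # Minimize ||A w - y||^2 subject to ||w||_0 <= k (illustrative NIHT sketch).
        # A: (n_samples, n_nodes) stacked feature/enhancement node outputs.
        # y: (n_samples,) targets for one output dimension.
        # k: sparsity budget, i.e. the number of nonzero output weights kept.
        w = np.zeros(A.shape[1])
        for _ in range(n_iter):
            g = A.T @ (y - A @ w)  # negative gradient of 0.5 * ||A w - y||^2
            # Compute the normalized step size on the current support
            # (top-k gradient entries on the first iteration, when w == 0).
            support = np.flatnonzero(w)
            if support.size == 0:
                support = np.argpartition(np.abs(g), -k)[-k:]
            g_s = g[support]
            Ag_s = A[:, support] @ g_s
            denom = Ag_s @ Ag_s
            mu = (g_s @ g_s) / denom if denom > 0 else 1.0
            w_next = hard_threshold(w + mu * g, k)
            if np.linalg.norm(w_next - w) <= tol * max(np.linalg.norm(w), 1.0):
                return w_next
            w = w_next
        return w

For multi-output tasks the same routine would be applied column by column to the target matrix; the budget k directly fixes how many node connections survive, which is the controllable-sparsity mechanism the abstract refers to.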
Pages: 9
Related Papers
50 records in total
  • [1] Smoothing L0 Regularization for Extreme Learning Machine
    Fan, Qinwei
    Liu, Ting
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020
  • [2] Multi-parameter Tikhonov regularization with the l0 sparsity constraint
    Wang, Wei
    Lu, Shuai
    Mao, Heng
    Cheng, Jin
    INVERSE PROBLEMS, 2013, 29 (06)
  • [3] Block Dictionary Learning with l0 Regularization and Its Application in Image Denoising
    Xue, Wei
    Zhang, Wensheng
    2017 13TH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION, FUZZY SYSTEMS AND KNOWLEDGE DISCOVERY (ICNC-FSKD), 2017
  • [4] Learning Deep l0 Encoders
    Wang, Zhangyang
    Ling, Qing
    Huang, Thomas S.
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016 : 2194 - 2200
  • [5] Sparse regularization with the l0 norm
    Xu, Yuesheng
    ANALYSIS AND APPLICATIONS, 2023, 21 (04) : 901 - 929
  • [6] Fuzzy clustering with L0 regularization
    Ferraro, Maria Brigida
    Forti, Marco
    Giordani, Paolo
    ANNALS OF OPERATIONS RESEARCH, 2025
  • [7] Entropic Regularization of the l0 Function
    Borwein, Jonathan M.
    Luke, D. Russell
    FIXED-POINT ALGORITHMS FOR INVERSE PROBLEMS IN SCIENCE AND ENGINEERING, 2011, 49 : 65 - +
  • [8] A HIERARCHICAL SPARSITY-SMOOTHNESS BAYESIAN MODEL FOR l0 + l1 + l2 REGULARIZATION
    Chaari, Lotfi
    Batatia, Hadj
    Dobigeon, Nicolas
    Tourneret, Jean-Yves
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014
  • [9] Wavelet inpainting with the l0 sparse regularization
    Shen, Lixin
    Xu, Yuesheng
    Zeng, Xueying
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2016, 41 (01) : 26 - 53