Learning broad learning system with controllable sparsity through L0 regularization

Cited by: 7
Authors
Chu, Fei [1 ,2 ,3 ]
Wang, Guanghui [2 ]
Wang, Jun [2 ]
Chen, C. L. Philip [4 ]
Wang, Xuesong [2 ]
Affiliations
[1] China Univ Min & Technol, Artificial Intelligence Res Inst, Xuzhou 221116, Peoples R China
[2] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221116, Peoples R China
[3] Beijing Gen Res Inst Min & Met, State Key Lab Automat Control Technol Min & Met P, Beijing 100160, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Broad Learning System (BLS); Network compression; Sparse representation; Controllable; Normalized iterative hard thresholding (NIHT); NEURAL-NETWORKS; APPROXIMATION;
DOI
10.1016/j.asoc.2023.110068
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
As a novel neural network with efficient learning capacity, the broad learning system (BLS) has achieved remarkable success in various regression and classification problems. Owing to its broad expansion of nodes, however, BLS is known to contain many redundant parameters and nodes, which increase memory and computation costs and hinder deployment on equipment with limited resources. To optimize the number of neurons and parameters of BLS and thereby find the optimal sparse model under a given resource budget, in this paper we propose training BLS through L0 regularization. The regularization term of the BLS objective function is replaced with an L0 constraint, and the normalized iterative hard thresholding (NIHT) method is used to optimize the output weights. More concretely, the size of the model is fixed by controlling the number of nonzero output weights under the given resource budget, and parameters and nodes are then evaluated and selected from the node set during training, yielding a BLS with controllable sparsity (CSBLS). Experiments on various data sets demonstrate the effectiveness of the proposed method. (C) 2023 Published by Elsevier B.V.
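The sketch below illustrates the core idea described in the abstract in Python/NumPy: a minimal BLS node matrix is built from random feature and enhancement nodes, and a single-output regression target is then fitted with NIHT so that at most s output weights are nonzero. This is a simplified illustration under stated assumptions, not the authors' implementation: the node counts, the purely random node generation (basic BLS can also fine-tune feature weights with a sparse autoencoder), the function names build_bls_features and niht, the single-output case, and the omission of NIHT's step-size backtracking safeguard are all assumptions made for brevity.

import numpy as np

rng = np.random.default_rng(0)

def build_bls_features(X, n_feature=40, n_enhance=60):
    # Concatenated node matrix A = [Z | H]: linear feature nodes Z and
    # tanh enhancement nodes H, both with randomly drawn weights/biases.
    d = X.shape[1]
    Z = X @ rng.normal(size=(d, n_feature)) + rng.normal(size=n_feature)
    H = np.tanh(Z @ rng.normal(size=(n_feature, n_enhance)) + rng.normal(size=n_enhance))
    return np.hstack([Z, H])

def hard_threshold(w, s):
    # Projection onto {w : ||w||_0 <= s}: keep the s largest-magnitude entries.
    out = np.zeros_like(w)
    keep = np.argpartition(np.abs(w), -s)[-s:]
    out[keep] = w[keep]
    return out

def niht(A, y, s, n_iter=200, tol=1e-8):
    # Normalized iterative hard thresholding for
    #     min_w ||y - A w||^2  subject to  ||w||_0 <= s,
    # with the adaptive step size of Blumensath & Davies (2010): the optimal
    # gradient step restricted to the current support estimate.
    w = np.zeros(A.shape[1])
    support = np.argpartition(np.abs(A.T @ y), -s)[-s:]  # initial support guess
    for _ in range(n_iter):
        g = A.T @ (y - A @ w)            # gradient of 0.5 * ||y - A w||^2
        g_s = np.zeros_like(g)
        g_s[support] = g[support]        # gradient restricted to the support
        denom = np.linalg.norm(A @ g_s) ** 2
        mu = np.dot(g_s, g_s) / denom if denom > 0 else 0.0
        w_new = hard_threshold(w + mu * g, s)
        support = np.flatnonzero(w_new)  # support may change after thresholding
        if np.linalg.norm(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    return w

# Toy usage: 100 candidate nodes, but at most 30 nonzero output weights.
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + 0.01 * rng.normal(size=500)
A = build_bls_features(X)
w = niht(A, y, s=30)
print(np.count_nonzero(w))  # <= 30: sparsity is fixed in advance by the budget s

Because the sparsity level s is fixed before training, the memory footprint of the resulting model is known in advance, which is the "controllable sparsity" the paper targets; an L1-penalized alternative would control sparsity only indirectly through its penalty weight.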
Pages: 9
Related papers
50 records in total
  • [41] Zhao, Wufan; Wu, Tingting; Feng, Chenchen; Wu, Wenna; Lv, Xiaoguang; Chen, Hongming; Liu, Jun. Image smoothing via a novel adaptive weighted L0 regularization. INTERNATIONAL JOURNAL OF NUMERICAL ANALYSIS AND MODELING, 2025, 22(01): 21-39.
  • [42] Zhao, Yong; Qin, Hong; Zeng, Xueying; Xu, Junli; Dong, Junyu. Robust and effective mesh denoising using L0 sparse regularization. COMPUTER-AIDED DESIGN, 2018, 101: 82-97.
  • [43] Yan, Yulong; Chu, Haoming; Jin, Yi; Huan, Yuxiang; Zou, Zhuo; Zheng, Lirong. Backpropagation with sparsity regularization for spiking neural network learning. FRONTIERS IN NEUROSCIENCE, 2022, 16.
  • [44] Tao, JianWen; Hu, Wenjun; Wang, Shitong. Sparsity regularization label propagation for domain adaptation learning. NEUROCOMPUTING, 2014, 139: 202-219.
  • [45] Yang, Pei; Tan, Qi; Zhu, Yada; He, Jingrui. Heterogeneous representation learning with separable structured sparsity regularization. KNOWLEDGE AND INFORMATION SYSTEMS, 2018, 55(03): 671-694.
  • [47] Jin, Junwei; Li, Yanting; Yang, Tiejun; Zhao, Liang; Duan, Junwei; Chen, C. L. Philip. Discriminative group-sparsity constrained broad learning system for visual recognition. INFORMATION SCIENCES, 2021, 576: 800-818.
  • [48] Mou, Miao; Zhao, Xiaoqiang; Liu, Kai; Cao, Shiyu; Hui, Yongyong. A latent representation dual manifold regularization broad learning system with incremental learning capability for fault diagnosis. MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34(07).
  • [49] Kang, Lican; Li, Xuerui; Liu, Yanyan; Luo, Yuan; Zou, Zhikang. A communication-efficient method for l0 regularization linear regression models. JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2023, 93(04): 533-555.
  • [50] Zhang, Huisheng; Tang, Yanli. Online gradient method with smoothing l0 regularization for feedforward neural networks. NEUROCOMPUTING, 2017, 224: 1-8.