Learning broad learning system with controllable sparsity through L0 regularization

Cited by: 7
Authors
Chu, Fei [1 ,2 ,3 ]
Wang, Guanghui [2 ]
Wang, Jun [2 ]
Chen, C. L. Philip [4 ]
Wang, Xuesong [2 ]
Affiliations
[1] China Univ Min & Technol, Artificial Intelligence Res Inst, Xuzhou 221116, Peoples R China
[2] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221116, Peoples R China
[3] Beijing Gen Res Inst Min & Met, State Key Lab Automat Control Technol Min & Met P, Beijing 100160, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Broad Learning System (BLS); Network compression; Sparse representation; Controllable sparsity; Normalized iterative hard thresholding (NIHT); Neural networks; Approximation
DOI
10.1016/j.asoc.2023.110068
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
As a novel neural network with efficient learning capacity, the broad learning system (BLS) has achieved remarkable success in various regression and classification problems. Owing to its broad expansion of nodes, however, BLS is known to contain many redundant nodes and parameters, which increase memory and computation costs and hinder deployment on equipment with limited resources. To optimize the number of neurons and parameters of BLS and thereby find the optimal sparse model under a given resource budget, in this paper we propose training BLS with L0 regularization. The regularization term of the BLS objective function is replaced by an L0 constraint, and the normalized iterative hard thresholding (NIHT) method is used to optimize the output weights. More concretely, the model size is fixed by capping the number of nonzero output weights according to the given resource budget; parameters and nodes are then evaluated and selected from the node set during training, yielding a BLS with controllable sparsity (CSBLS). Experiments on various data sets demonstrate the effectiveness of the proposed method. (C) 2023 Published by Elsevier B.V.
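Note: this record does not include the authors' implementation. The following is a minimal, illustrative Python sketch of the optimization step the abstract describes: fitting BLS output weights under an explicit L0 budget with normalized iterative hard thresholding (NIHT). The node construction (bls_nodes), the random feature mappings, and all parameter values are assumptions for illustration, and the NIHT loop omits the step-size backtracking safeguard of the full algorithm.

import numpy as np

rng = np.random.default_rng(0)

def bls_nodes(X, n_feature_nodes=40, n_enhance_nodes=60):
    # Toy BLS state matrix A = [Z | H]: random-weight feature nodes Z
    # followed by tanh enhancement nodes H. (Illustrative only; the paper's
    # BLS learns its feature mappings rather than fixing them randomly.)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])          # add bias column
    Z = Xb @ rng.standard_normal((Xb.shape[1], n_feature_nodes))
    Zb = np.hstack([Z, np.ones((Z.shape[0], 1))])
    H = np.tanh(Zb @ rng.standard_normal((Zb.shape[1], n_enhance_nodes)))
    return np.hstack([Z, H])

def niht(A, y, k, n_iters=300, tol=1e-10):
    # Normalized iterative hard thresholding for
    #   min_w ||y - A w||^2   s.t.   ||w||_0 <= k   (single-output case).
    n = A.shape[1]
    w = np.zeros(n)
    support = np.argsort(np.abs(A.T @ y))[-k:]             # initial support
    for _ in range(n_iters):
        g = A.T @ (y - A @ w)                              # negative gradient
        gs = np.zeros(n)
        gs[support] = g[support]                           # restrict to support
        denom = (A @ gs) @ (A @ gs)
        mu = (gs @ gs) / denom if denom > 0 else 1e-3      # normalized step
        w_plus = w + mu * g
        idx = np.argsort(np.abs(w_plus))[-k:]              # keep k largest
        w_new = np.zeros(n)
        w_new[idx] = w_plus[idx]
        if np.linalg.norm(w_new - w) <= tol:
            return w_new
        w, support = w_new, idx
    return w

# Toy regression: at most k output weights may be nonzero, so the
# resource budget directly controls the trained model's size.
X = rng.standard_normal((200, 10))
y = X @ rng.standard_normal(10) + 0.01 * rng.standard_normal(200)
A = bls_nodes(X)
w = niht(A, y, k=20)
print("nonzero output weights:", np.count_nonzero(w))      # <= 20 by construction

Because the thresholding step never retains more than k coefficients, the number of active output weights, and hence the effective model size, is fixed in advance; this is the controllable-sparsity behavior the abstract attributes to CSBLS.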
Pages: 9
Related Papers (50 records)
  • [21] Broad Multitask Learning System With Group Sparse Regularization
    Huang, Jintao
    Chen, Chuangquan
    Vong, Chi-Man
    Cheung, Yiu-Ming
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [22] Hyper-parameter Selection on Convolutional Dictionary Learning Through Local l0,∞ Norm
    Silva, Gustavo
    Quesada, Jorge
    Rodriguez, Paul
    2019 27TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2019,
  • [23] Heterogeneous Representation Learning with Structured Sparsity Regularization
    Yang, Pei
    He, Jingrui
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 539 - 548
  • [24] Dictionary Learning for Sparse Representation Based on Smoothed L0 Norm
    Akhavan, S.
    Soltanian-Zadeh, H.
    2017 24TH NATIONAL AND 2ND INTERNATIONAL IRANIAN CONFERENCE ON BIOMEDICAL ENGINEERING (ICBME), 2017, : 278 - 283
  • [25] Image decomposition model OSV with L0 sparse regularization
    Wang, Guodong
    Xu, Jie
    Pan, Zhenkuan
    Zhang, Weizhong
    Diao, Zhaojing
    JOURNAL OF INFORMATION AND COMPUTATIONAL SCIENCE, 2015, 12 (02) : 743 - 750
  • [26] Capped lp Approximations for the Composite l0 Regularization Problem
    Li, Qia
    Zhang, Na
    INVERSE PROBLEMS AND IMAGING, 2018, 12 (05) : 1219 - 1243
  • [27] Adaptive L0 Regularization for Sparse Support Vector Regression
    Christou, Antonis
    Artemiou, Andreas
    MATHEMATICS, 2023, 11 (13)
  • [28] Blind Inpainting Using l0 and Total Variation Regularization
    Afonso, Manya V.
    Raposo Sanches, Joao Miguel
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (07) : 2239 - 2253
  • [29] Sparse hyperspectral unmixing based on smoothed l0 regularization
    Deng, Chengzhi
    Zhang, Shaoquan
    Wang, Shengqian
    Tian, Wei
    Wu, Zhaoming
    INFRARED PHYSICS & TECHNOLOGY, 2014, 67 : 306 - 314
  • [30] Learning Object-Relative Spatial Concepts in the L0 Project
    Regier, T.
    PROGRAM OF THE THIRTEENTH ANNUAL CONFERENCE OF THE COGNITIVE SCIENCE SOCIETY, 1991, : 191 - 196