Compact Deep Neural Networks with l1,1 and l1,2 Regularization

Cited by: 0
Authors
Ma, Rongrong [1 ]
Niu, Lingfeng [2 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
deep neural networks; sparse regularizer; l(1,1); l(1,2);
DOI
10.1109/ICDMW.2018.00178
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Deep neural networks have demonstrated their superiority in many fields. Their excellent performance relies on a very large number of parameters, which leads to a series of problems, including heavy memory and computation requirements and overfitting, and seriously impedes the practical deployment of deep neural networks in many tasks. A considerable number of model compression methods have been proposed to reduce the number of parameters in networks, among which one family of methods pursues sparsity in the network weights. In this paper, we propose to combine the l(1,1) and l(1,2) norms as the regularization term of the network's objective function. We partition the weights into groups: the l(1,1) regularizer can zero out weights at both the inter-group and intra-group level, while the l(1,2) regularizer yields intra-group sparsity and evens out the weights among groups. We adopt proximal gradient descent to solve the objective function regularized by the combined term. Experimental results demonstrate the effectiveness of the proposed regularizer compared with other baseline regularizers.
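The abstract does not spell out the norm definitions, so the sketch below is only a plausible reading, not the paper's implementation: groups are assumed to be the rows of a weight matrix W, l(1,1) applies an l1 norm within each group and then an l1 norm across groups, and l(1,2) applies an l1 norm within each group followed by an l2 norm across groups, matching the described intra-group sparsity and inter-group evenness. The helper names (l11_penalty, l12_penalty, regularized_objective) and the lambda values are illustrative.

```python
import numpy as np

def l11_penalty(W):
    # l(1,1): l1 norm within each group, then l1 across groups.
    # With the rows of W taken as groups this equals the l1 norm of all
    # entries, so it can zero out individual weights at both the
    # inter-group and intra-group level.
    return np.sum(np.abs(W))

def l12_penalty(W):
    # Assumed reading of l(1,2): l1 norm within each group, then an l2
    # norm over the vector of per-group values.  The inner l1 encourages
    # intra-group sparsity; the outer l2 discourages any one group from
    # dominating, evening the weights out among groups.
    group_l1 = np.sum(np.abs(W), axis=1)   # one l1 norm per group (row)
    return np.sqrt(np.sum(group_l1 ** 2))

def regularized_objective(data_loss, W, lam1=1e-4, lam2=1e-4):
    # Training objective: the data-fitting loss plus the combined
    # regularizer; lam1 and lam2 are illustrative hyperparameters.
    return data_loss + lam1 * l11_penalty(W) + lam2 * l12_penalty(W)
```

The paper minimizes the resulting non-smooth objective with proximal gradient descent; the proximal operator of the combined term is not reproduced here, though for the l(1,1) part alone it reduces to the standard soft-thresholding step.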
Pages: 1248 - 1254
Number of pages: 7