Controlling information capacity of binary neural network

Cited: 13
Authors
Ignatov, Dmitry [1 ]
Ignatov, Andrey [2 ]
Affiliations
[1] Huawei Technol, Russian Res Ctr, Moscow 121614, Russia
[2] Swiss Fed Inst Technol, Dept Comp Sci, CH-8092 Zurich, Switzerland
Keywords
Deep learning; Binary neural network; Information theory; Shannon entropy;
DOI
10.1016/j.patrec.2020.07.033
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Despite the growing popularity of deep learning technologies, high memory requirements and power consumption substantially limit their application in mobile and IoT settings. While binary convolutional networks can alleviate these problems, the limited bit-width of weights often leads to significant degradation of prediction accuracy. In this paper, we present a method for training binary networks that maintains a stable, predefined level of their information capacity throughout the training process by applying a Shannon entropy-based penalty to convolutional filters. Experiments conducted on the SVHN, CIFAR and ImageNet datasets demonstrate that the proposed approach yields statistically significant improvements in the accuracy of binary networks. (C) 2020 Elsevier B.V. All rights reserved.
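A minimal sketch of the idea described in the abstract: for a binary filter, the sign distribution of its weights determines its Shannon entropy, and a regularizer can keep that entropy near a predefined target throughout training. The function names and the quadratic form of the penalty below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def filter_entropy(weights, eps=1e-12):
    """Shannon entropy (in bits) of the sign distribution of one filter."""
    p = np.mean(np.sign(weights) > 0)  # fraction of weights binarized to +1
    p = np.clip(p, eps, 1.0 - eps)     # avoid log(0)
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def capacity_penalty(conv_weights, target_entropy=1.0):
    """Quadratic penalty pulling each filter's entropy toward a target level.

    conv_weights: array of shape (out_channels, in_channels, kH, kW);
    iterating over the first axis yields one filter at a time.
    """
    penalties = [(filter_entropy(f) - target_entropy) ** 2
                 for f in conv_weights]
    return float(np.mean(penalties))
```

A balanced filter (half +1, half -1) has entropy 1 bit and incurs no penalty, while a filter collapsing to a single sign (near-zero entropy) is penalized, which matches the stated goal of holding information capacity at a stable predefined level.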
Pages: 276-281
Page count: 6