Conceptual complexity of neural networks

Cited by: 1
Authors
Szymanski, Lech [1 ]
McCane, Brendan [1 ]
Atkinson, Craig [1 ]
Affiliations
[1] Univ Otago, Dept Comp Sci, 133 Union St East, Dunedin, New Zealand
Keywords
Deep learning; Learning theory; Complexity measures
DOI
10.1016/j.neucom.2021.10.063
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We propose a complexity measure of a neural network mapping function based on the order and diversity of the set of tangent spaces from different inputs. Treating each tangent space as a linear PAC concept, we use an entropy-based measure of the bundle of concepts to estimate the conceptual capacity of the network. The theoretical maximal capacity of a ReLU network equals the number of its neurons. In practice, however, due to correlations between neuron activities within the network, the actual capacity can be remarkably small, even for very large networks. We formulate a new measure of conceptual complexity by normalising the capacity of the network by the degree of separation of concepts related to different classes. Empirical evaluations show that this new measure is correlated with the generalisation capabilities of the corresponding network. It captures the effective, as opposed to the theoretical, complexity of the network function. We also showcase some uses of the proposed measures for the analysis and comparison of trained neural network models. (c) 2021 Elsevier B.V. All rights reserved.
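Below is a minimal, illustrative sketch of how an entropy-based capacity estimate of this kind could be computed, under one plausible reading of the abstract: the tangent space at each input is summarised by the network's input-output Jacobian, similar Jacobians are grouped into "concepts" by k-means clustering (a stand-in for the paper's concept-bundle construction), and the entropy of the concept distribution serves as the capacity proxy. The function names (jacobian_features, conceptual_capacity) and the cluster count k are hypothetical choices for this sketch, not the authors' definitions.

```python
# Illustrative sketch only -- not the authors' exact construction.
# Tangent spaces are taken to be input-output Jacobians; for a ReLU
# network the Jacobian is constant within each linear region, so
# clustering groups inputs that share (approximately) one concept.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


def jacobian_features(model: nn.Module, xs: torch.Tensor) -> np.ndarray:
    """One row per input: the flattened input-output Jacobian at that point."""
    rows = []
    for x in xs:
        J = torch.autograd.functional.jacobian(model, x.unsqueeze(0))
        rows.append(J.reshape(-1).numpy())
    return np.stack(rows)


def conceptual_capacity(model: nn.Module, xs: torch.Tensor, k: int = 8) -> float:
    """Entropy (in bits) of the cluster assignment of tangent spaces."""
    feats = jacobian_features(model, xs)
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(feats)
    p = np.bincount(labels, minlength=k) / len(labels)
    p = p[p > 0]  # drop empty clusters before taking logs
    return float(-(p * np.log2(p)).sum())


# Toy usage: a small ReLU network. Its theoretical capacity is bounded by
# its neuron count; correlated activations typically yield far less.
net = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
xs = torch.randn(200, 4)
print(f"estimated conceptual capacity: {conceptual_capacity(net, xs):.2f} bits")
```

The paper's conceptual complexity measure additionally normalises this capacity by the degree of separation between concepts belonging to different classes; that normalisation step is omitted from the sketch above.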
Pages: 52-64
Number of pages: 13
Related papers
50 in total
  • [41] Information theory, complexity, and neural networks
    Abu-Mostafa, Y. S.
    IEEE Communications Magazine, 1989, 27(11): 25 ff.
  • [42] Dropout Rademacher complexity of deep neural networks
    Gao, Wei
    Zhou, Zhi-Hua
    Science China Information Sciences, 2016, 59(7)
  • [43] Energy Complexity Model for Convolutional Neural Networks
    Sima, Jiri
    Vidnerova, Petra
    Mrazek, Vojtech
    Artificial Neural Networks and Machine Learning, ICANN 2023, Part X, 2023, 14263: 186-198
  • [44] Reduced-complexity circuit for neural networks
    Watkins, S. S.
    Chau, P. M.
    Electronics Letters, 1995, 31(19): 1644-1646
  • [45] Architectural Complexity Measures of Recurrent Neural Networks
    Zhang, Saizheng
    Wu, Yuhuai
    Che, Tong
    Lin, Zhouhan
    Memisevic, Roland
    Salakhutdinov, Ruslan
    Bengio, Yoshua
    Advances in Neural Information Processing Systems 29 (NIPS 2016), 2016, 29
  • [47] Classes of feedforward neural networks and their circuit complexity
    Shawe-Taylor, J. S.
    Anthony, M. H. G.
    Kern, W.
    Neural Networks, 1992, 5(6): 971-977
  • [48] Embedding Complexity of Learned Representations in Neural Networks
    Kuzma, Tomas
    Farkas, Igor
    Artificial Neural Networks and Machine Learning - ICANN 2019: Deep Learning, Part II, 2019, 11728: 518-528
  • [49] Quantizability and learning complexity in multilayer neural networks
    Fu, L. M.
    IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 1998, 28(2): 295-300
  • [50] On the complexity of computing and learning with multiplicative neural networks
    Schmitt, M.
    Neural Computation, 2002, 14(2): 241-301