Decomposing neural networks as mappings of correlation functions

Cited: 10
Authors
Fischer, Kirsten [1 ,2 ,3 ,4 ]
Rene, Alexandre [1 ,2 ,3 ,5 ,6 ]
Keup, Christian [1 ,2 ,3 ,4 ]
Layer, Moritz [1 ,2 ,3 ,4 ]
Dahmen, David [1 ,2 ,3 ]
Helias, Moritz [1 ,2 ,3 ,6 ]
Affiliations
[1] Julich Res Ctr, Inst Neurosci & Med INM 6, D-52425 Julich, Germany
[2] Julich Res Ctr, Inst Adv Simulat IAS 6, D-52425 Julich, Germany
[3] Julich Res Ctr, JARA Inst Brain Struct Funct Relationships INM 10, D-52425 Julich, Germany
[4] Rhein Westfal TH Aachen, D-52062 Aachen, Germany
[5] Univ Ottawa, Dept Phys, Ottawa, ON K1N 6N5, Canada
[6] Rhein Westfal TH Aachen, Dept Phys, Fac 1, D-52074 Aachen, Germany
Source
PHYSICAL REVIEW RESEARCH | 2022, Vol. 4, Issue 04
Keywords
DEEP;
DOI
10.1103/PhysRevResearch.4.043143
Chinese Library Classification
O4 [Physics];
Discipline Code
0702;
Abstract
Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus nonrandom weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the nonlinearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
Pages: 23
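The abstract describes each layer as a transformation of the input distribution, in which the linear step maps the first two correlation functions (mean and covariance) exactly while the nonlinearity couples different orders. A minimal sketch of that picture, not the paper's actual derivation: propagate Gaussian statistics through one hypothetical feed-forward layer and estimate the post-nonlinearity statistics by Monte Carlo sampling. All names, sizes, and parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out, n_samples = 3, 2, 200_000

# Input distribution characterized by its first two correlation functions.
mu_x = np.zeros(d_in)
cov_x = 0.5 * np.eye(d_in)

# A fixed (trained or random) weight matrix and bias; values are arbitrary.
W = rng.normal(scale=1.0 / np.sqrt(d_in), size=(d_out, d_in))
b = np.zeros(d_out)

# Linear step: maps mean and covariance exactly (a Gaussian stays Gaussian).
mu_h = W @ mu_x + b
cov_h = W @ cov_x @ W.T

# Nonlinear step: sample the pre-activations and estimate the output
# statistics empirically. The nonlinearity transfers information between
# orders -- e.g., the output covariance depends on the full input
# distribution, not just on a linearized image of it.
h = rng.multivariate_normal(mu_h, cov_h, size=n_samples)
y = np.tanh(h)
mu_y = y.mean(axis=0)
cov_y = np.cov(y, rowvar=False)

print(mu_y)   # ~0 here, since tanh is odd and the pre-activation mean is 0
print(cov_y)  # contracted relative to cov_h, since |tanh(x)| < |x|
```

Iterating this two-step mapping layer by layer gives the kind of distribution-level decomposition the abstract refers to; the paper itself works with the correlation functions analytically rather than by sampling.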