Decomposing neural networks as mappings of correlation functions

Cited by: 10
Authors
Fischer, Kirsten [1 ,2 ,3 ,4 ]
Rene, Alexandre [1 ,2 ,3 ,5 ,6 ]
Keup, Christian [1 ,2 ,3 ,4 ]
Layer, Moritz [1 ,2 ,3 ,4 ]
Dahmen, David [1 ,2 ,3 ]
Helias, Moritz [1 ,2 ,3 ,6 ]
Affiliations
[1] Julich Res Ctr, Inst Neurosci & Med INM 6, D-52425 Julich, Germany
[2] Julich Res Ctr, Inst Adv Simulat IAS 6, D-52425 Julich, Germany
[3] Julich Res Ctr, JARA Inst Brain Struct Funct Relationships INM 10, D-52425 Julich, Germany
[4] Rhein Westfal TH Aachen, D-52062 Aachen, Germany
[5] Univ Ottawa, Dept Phys, Ottawa, ON K1N 6N5, Canada
[6] Rhein Westfal TH Aachen, Dept Phys, Fac 1, D-52074 Aachen, Germany
Source
PHYSICAL REVIEW RESEARCH, 2022, Vol. 4, Issue 4
Keywords
DEEP;
DOI
10.1103/PhysRevResearch.4.043143
Chinese Library Classification
O4 [Physics];
Discipline code
0702
Abstract
Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus nonrandom weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the nonlinearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
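The abstract's central observation is that a layer's nonlinearity transfers information between different orders of correlation functions. A minimal Monte Carlo sketch (an illustration constructed here, not the paper's formalism) of this effect: two zero-mean Gaussian input ensembles that differ only in their second-order statistics acquire different *first-order* statistics after a ReLU nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer_stats(sigma, n_samples=100_000):
    # Zero-mean Gaussian input with standard deviation sigma,
    # passed through an elementwise ReLU nonlinearity.
    x = rng.normal(0.0, sigma, size=n_samples)
    y = np.maximum(x, 0.0)
    return y.mean(), y.var()

# Both input ensembles have mean zero; only the second-order
# statistic (the variance) differs. The output means nevertheless
# differ: the nonlinearity has moved information from second-order
# to first-order correlation functions.
m1, _ = layer_stats(sigma=0.5)
m2, _ = layer_stats(sigma=2.0)
print(m1, m2)  # analytically sigma / sqrt(2*pi)
```

For a ReLU acting on a zero-mean Gaussian, the output mean is sigma/sqrt(2*pi), so the two ensembles become linearly separable by their first moment alone, which is the kind of statistic-mixing the paper quantifies layer by layer.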
Pages: 23