Decomposing neural networks as mappings of correlation functions

Cited by: 10
Authors
Fischer, Kirsten [1 ,2 ,3 ,4 ]
Rene, Alexandre [1 ,2 ,3 ,5 ,6 ]
Keup, Christian [1 ,2 ,3 ,4 ]
Layer, Moritz [1 ,2 ,3 ,4 ]
Dahmen, David [1 ,2 ,3 ]
Helias, Moritz [1 ,2 ,3 ,6 ]
Affiliations
[1] Julich Res Ctr, Inst Neurosci & Med INM 6, D-52425 Julich, Germany
[2] Julich Res Ctr, Inst Adv Simulat IAS 6, D-52425 Julich, Germany
[3] Julich Res Ctr, JARA Inst Brain Struct Funct Relationships INM 10, D-52425 Julich, Germany
[4] Rhein Westfal TH Aachen, D-52062 Aachen, Germany
[5] Univ Ottawa, Dept Phys, Ottawa, ON K1N 6N5, Canada
[6] Rhein Westfal TH Aachen, Dept Phys, Fac 1, D-52074 Aachen, Germany
Source
PHYSICAL REVIEW RESEARCH | 2022, Vol. 4, No. 04
Keywords
DEEP;
DOI
10.1103/PhysRevResearch.4.043143
Chinese Library Classification
O4 [Physics];
Subject Classification Code
0702;
Abstract
Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus nonrandom weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the nonlinearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
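As an illustration of the layer-wise mapping of correlation functions described in the abstract, the following minimal Python/NumPy sketch tracks only the first two cumulants (mean and covariance) of the activations across a single layer, using a Gaussian approximation of the input and plain Monte Carlo sampling. This is not the authors' implementation; the weight matrix W, the bias b, the tanh nonlinearity, and the helper propagate_gaussian are illustrative assumptions.

# Minimal sketch (assumed setup, not the paper's code): propagate first- and
# second-order statistics (mean and covariance) through one feed-forward layer
# y = phi(W x + b), approximating the input distribution by a Gaussian.
import numpy as np

rng = np.random.default_rng(0)

def propagate_gaussian(mean, cov, W, b, phi=np.tanh, n_samples=100_000):
    """Map input statistics (mean, cov) to output statistics after y = phi(W x + b)."""
    x = rng.multivariate_normal(mean, cov, size=n_samples)  # Gaussian input samples
    y = phi(x @ W.T + b)                                    # layer output samples
    return y.mean(axis=0), np.cov(y, rowvar=False)          # re-read mean and covariance

# Example: an illustrative 2 -> 3 layer with random weights and correlated inputs.
d_in, d_out = 2, 3
W = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)
b = np.zeros(d_out)
mu0 = np.zeros(d_in)
Sigma0 = np.array([[1.0, 0.5], [0.5, 1.0]])
mu1, Sigma1 = propagate_gaussian(mu0, Sigma0, W, b)
print(mu1, Sigma1)

Iterating propagate_gaussian over successive layers yields the correlations up to second order whose role the paper examines on the XOR task and MNIST; the higher-order correlations generated by the nonlinearity are deliberately discarded in this sketch.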
Pages: 23