Decomposing neural networks as mappings of correlation functions

Cited: 10
Authors
Fischer, Kirsten [1 ,2 ,3 ,4 ]
Rene, Alexandre [1 ,2 ,3 ,5 ,6 ]
Keup, Christian [1 ,2 ,3 ,4 ]
Layer, Moritz [1 ,2 ,3 ,4 ]
Dahmen, David [1 ,2 ,3 ]
Helias, Moritz [1 ,2 ,3 ,6 ]
Affiliations
[1] Julich Res Ctr, Inst Neurosci & Med INM 6, D-52425 Julich, Germany
[2] Julich Res Ctr, Inst Adv Simulat IAS 6, D-52425 Julich, Germany
[3] Julich Res Ctr, JARA Inst Brain Struct Funct Relationships INM 10, D-52425 Julich, Germany
[4] Rhein Westfal TH Aachen, D-52062 Aachen, Germany
[5] Univ Ottawa, Dept Phys, Ottawa, ON K1N 6N5, Canada
[6] Rhein Westfal TH Aachen, Dept Phys, Fac 1, D-52074 Aachen, Germany
Source
PHYSICAL REVIEW RESEARCH | 2022 / Vol. 4 / Issue 04
Keywords
DEEP;
DOI
10.1103/PhysRevResearch.4.043143
Chinese Library Classification
O4 [Physics];
Discipline Code
0702;
Abstract
Understanding the functional principles of information processing in deep neural networks continues to be a challenge, in particular for networks with trained and thus nonrandom weights. To address this issue, we study the mapping between probability distributions implemented by a deep feed-forward network. We characterize this mapping as an iterated transformation of distributions, where the nonlinearity in each layer transfers information between different orders of correlation functions. This allows us to identify essential statistics in the data, as well as different information representations that can be used by neural networks. Applied to an XOR task and to MNIST, we show that correlations up to second order predominantly capture the information processing in the internal layers, while the input layer also extracts higher-order correlations from the data. This analysis provides a quantitative and explainable perspective on classification.
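As a rough illustration of the abstract's central idea (a minimal sketch, not the authors' formalism), the following Python snippet propagates first- and second-order correlation functions through a single layer: a Gaussian input with given mean and covariance is pushed through an affine map and a tanh nonlinearity, and the output mean and covariance are estimated by sampling. All names (layer_stats, W, b) and the choice of tanh are illustrative assumptions.

import numpy as np

def layer_stats(mean, cov, W, b, rng, n_samples=100_000):
    # Estimate output mean and covariance of phi(W x + b), phi = tanh,
    # for Gaussian input x ~ N(mean, cov). The nonlinearity makes the
    # output non-Gaussian, generating higher-order cumulants; only
    # orders one and two are tracked here (a second-order closure).
    x = rng.multivariate_normal(mean, cov, size=n_samples)
    y = np.tanh(x @ W.T + b)  # pointwise nonlinearity mixes orders
    return y.mean(axis=0), np.cov(y, rowvar=False)

rng = np.random.default_rng(0)
d_in, d_out = 3, 2
W = rng.normal(size=(d_out, d_in)) / np.sqrt(d_in)  # random layer weights
m1, c1 = layer_stats(np.zeros(d_in), np.eye(d_in), W, np.zeros(d_out), rng)
print("layer-1 mean:", m1)
print("layer-1 covariance:\n", c1)

Iterating layer_stats layer by layer realizes an "iterated transformation of distributions" under a second-order closure; the gap between this closure and the fully sampled output distribution indicates how much information the nonlinearity transfers into higher-order correlation functions.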
Pages: 23