Transfer Entropy in Deep Neural Networks

Cited by: 0
Authors
Andonie, R. [1 ,2 ]
Cataron, A. [2 ,3 ]
Moldovan, A. [3 ]
Affiliations
[1] Cent Washington Univ, Dept Comp Sci, Ellensburg, WA 98926 USA
[2] Transilvania Univ Brasov, Dept Elect & Comp, Brasov, Romania
[3] Siemens SRL, Siemens Res & Predev, Brasov, Romania
Keywords
Transfer entropy; causality; deep learning; neural network explainability
DOI
10.15837/ijccc.2025.1.6904
CLC number
TP [automation technology, computer technology]
Discipline code
0812
Abstract
This paper explores the application of Transfer Entropy (TE) in deep neural networks as a tool for improving training efficiency and analyzing causal information flow. TE is a measure of directed information transfer that captures nonlinear dependencies and temporal dynamics between system components. The study investigates the use of TE to optimize learning in Convolutional Neural Networks and Graph Convolutional Neural Networks. We present case studies demonstrating reduced training times and improved accuracy. In addition, we apply TE within the framework of Information Bottleneck theory, providing insight into the trade-off between compression and information preservation during the training of deep learning architectures. The results highlight TE's potential for identifying causal features, improving explainability, and addressing challenges such as oversmoothing in Graph Convolutional Neural Networks. Although computational overhead and complexity pose challenges, the findings emphasize the role of TE in creating more efficient and interpretable neural models.
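The directed information transfer described in the abstract is, in the standard formulation due to Schreiber, TE(Y -> X) = sum over (x_{t+1}, x_t, y_t) of p(x_{t+1}, x_t, y_t) * log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]. The following is an illustrative plug-in estimator for discrete time series — a minimal sketch of the general definition, not the authors' implementation, with a toy coupled system chosen for demonstration:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target) for discrete series:
    sum p(x+, x, y) * log2( p(x+ | x, y) / p(x+ | x) )."""
    triples = Counter()   # counts of (x_next, x, y)
    pairs_xy = Counter()  # counts of (x, y)
    pairs_xx = Counter()  # counts of (x_next, x)
    singles = Counter()   # counts of x
    n = len(target) - 1
    for t in range(n):
        x, y, x_next = target[t], source[t], target[t + 1]
        triples[(x_next, x, y)] += 1
        pairs_xy[(x, y)] += 1
        pairs_xx[(x_next, x)] += 1
        singles[x] += 1
    te = 0.0
    for (x_next, x, y), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x, y)]          # p(x+ | x, y)
        p_cond_x = pairs_xx[(x_next, x)] / singles[x]  # p(x+ | x)
        te += p_joint * log2(p_cond_xy / p_cond_x)
    return te

# Toy causal system: x copies y with a one-step lag, so the estimator
# should find TE(y -> x) close to 1 bit and TE(x -> y) close to 0.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]  # x_{t+1} = y_t
print(transfer_entropy(y, x) > transfer_entropy(x, y))  # True
```

Plug-in estimation like this is only practical for coarse discretizations; the computational overhead the abstract mentions comes largely from estimating these conditional distributions over many neuron pairs during training.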
Pages: 20
Related papers
50 records in total
  • [1] Transfer Entropy in Graph Convolutional Neural Networks
    Moldovan, Adrian
    Cataron, Angel
    Andonie, Razvan
    2024 28TH INTERNATIONAL CONFERENCE INFORMATION VISUALISATION, IV 2024, 2024, : 207 - 213
  • [2] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [3] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2019, 2019 (12)
  • [4] Entropy-Constrained Training of Deep Neural Networks
    Wiedemann, Simon
    Marban, Arturo
    Mueller, Klaus-Robert
    Samek, Wojciech
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019
  • [5] On Local Entropy, Stochastic Control, and Deep Neural Networks
    Pavon, Michele
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 437 - 441
  • [6] Learning in Feedforward Neural Networks Accelerated by Transfer Entropy
    Moldovan, Adrian
    Cataron, Angel
    Andonie, Razvan
    ENTROPY, 2020, 22 (01) : 102
  • [7] Learning in Convolutional Neural Networks Accelerated by Transfer Entropy
    Moldovan, Adrian
    Cataron, Angel
    Andonie, Razvan
    ENTROPY, 2021, 23 (09)
  • [8] Parameter Transfer Unit for Deep Neural Networks
    Zhang, Yinghua
    Zhang, Yu
    Yang, Qiang
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2019, PT II, 2019, 11440 : 82 - 95
  • [9] DSNNs:learning transfer from deep neural networks to spiking neural networks
    Zhang, Lei
    Du, Zidong
    Li, Ling
    Chen, Yunji
    High Technology Letters, 2020, 26 (02) : 136 - 144
  • [10] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang L.
    Du Z.
    Li L.
    Chen Y.
    High Technology Letters, 2020, 26 (02) : 136 - 144