Transfer Entropy in Deep Neural Networks

Cited by: 0
Authors
Andonie, R. [1 ,2 ]
Cataron, A. [2 ,3 ]
Moldovan, A. [3 ]
Affiliations
[1] Cent Washington Univ, Dept Comp Sci, Ellensburg, WA 98926 USA
[2] Transilvania Univ Brasov, Dept Elect & Comp, Brasov, Romania
[3] Siemens SRL, Siemens Res & Predev, Brasov, Romania
Keywords
Transfer entropy; causality; deep learning; neural network explainability;
DOI
10.15837/ijccc.2025.1.6904
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
This paper explores the application of Transfer Entropy (TE) in deep neural networks as a tool to improve training efficiency and analyze causal information flow. TE is a measure of directed information transfer that captures nonlinear dependencies and temporal dynamics between system components. The study investigates the use of TE in optimizing learning in Convolutional Neural Networks and Graph Convolutional Neural Networks. We present case studies that demonstrate reduced training times and improved accuracy. In addition, we apply TE within the framework of the Information Bottleneck theory, providing insight into the trade-off between compression and information preservation during the training of deep learning architectures. The results highlight TE's potential for identifying causal features, improving explainability, and addressing challenges such as oversmoothing in Graph Convolutional Neural Networks. Although computational overhead and complexity pose challenges, the findings emphasize the role of TE in creating more efficient and interpretable neural models.
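For readers unfamiliar with the measure, Transfer Entropy from a source process Y to a target process X is conventionally defined, following Schreiber's formulation (the specific estimator and embedding used in the paper are not given in this record), as

\[
T_{Y \to X} \;=\; \sum p\!\left(x_{t+1},\, x_t^{(k)},\, y_t^{(l)}\right)\,
\log \frac{p\!\left(x_{t+1} \mid x_t^{(k)},\, y_t^{(l)}\right)}{p\!\left(x_{t+1} \mid x_t^{(k)}\right)},
\]

where x_t^{(k)} and y_t^{(l)} denote the k and l most recent states of X and Y. The measure is asymmetric in Y and X, which is what lets it quantify directed, rather than merely mutual, information flow.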
Pages: 20
Related Papers (50 in total)
  • [31] Layer Removal for Transfer Learning with Deep Convolutional Neural Networks. Zhi, Weiming; Chen, Zhenghao; Yeung, Henry Wing Fung; Lu, Zhicheng; Zandavi, Seid Miad; Chung, Yuk Ying. NEURAL INFORMATION PROCESSING (ICONIP 2017), PT II, 2017, 10635: 460-469.
  • [32] Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels. Zhang, Zhilu; Sabuncu, Mert R. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31.
  • [33] Graph neural networks and transfer entropy enhance forecasting of mesozooplankton community dynamics. Jeung, Minhyuk; Jang, Min-Chul; Shin, Kyoungsoon; Jung, Seung Won; Baek, Sang-Soo. ENVIRONMENTAL SCIENCE AND ECOTECHNOLOGY, 2025, 23.
  • [34] Power Control in Massive MIMO Networks Using Transfer Learning with Deep Neural Networks. Ahmadi, Neda; Mporas, Iosif; Papazafeiropoulos, Anastasios; Kourtessis, Pandelis; Senior, John. 2022 IEEE 27TH INTERNATIONAL WORKSHOP ON COMPUTER AIDED MODELING AND DESIGN OF COMMUNICATION LINKS AND NETWORKS (CAMAD), 2022: 89-93.
  • [35] Computationally Efficient Training of Deep Neural Networks via Transfer Learning. Oyen, Diane. REAL-TIME IMAGE PROCESSING AND DEEP LEARNING 2019, 2019, 10996.
  • [36] Deep Neural Networks with Transfer Learning Model for Brain Tumors Classification. Bulla, Premamayudu; Anantha, Lakshmipathi; Peram, Subbarao. TRAITEMENT DU SIGNAL, 2020, 37(04): 593-601.
  • [37] Deep Convolutional Neural Networks with Transfer Learning for Visual Sentiment Analysis. Devi, K. Usha Kingsly; Gomathi, V. NEURAL PROCESSING LETTERS, 2023, 55(04): 5087-5120.
  • [38] Virtual Stain Transfer in Histology via Cascaded Deep Neural Networks. Yang, Xilin; Bai, Bijie; Zhang, Yijie; Li, Yuzhu; de Haan, Kevin; Liu, Tairan; Ozcan, Aydogan. ACS PHOTONICS, 2022, 9(09): 3134-3143.
  • [39] Flexible Bayesian Inference by Weight Transfer for Robust Deep Neural Networks. Khong, Thi Thu Thao; Nakada, Takashi; Nakashima, Yasuhiko. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2021, E104D(11): 1981-1991.
  • [40] Transfer Learning Approach in Deep Neural Networks for Uterine Fibroid Detection. Sundar, Sumod; Sumathy, S. INTERNATIONAL JOURNAL OF COMPUTATIONAL SCIENCE AND ENGINEERING, 2022, 25(01): 52-63.