Transfer Entropy in Deep Neural Networks

Cited by: 0
Authors
Andonie, R. [1 ,2 ]
Cataron, A. [2 ,3 ]
Moldovan, A. [3 ]
Affiliations
[1] Cent Washington Univ, Dept Comp Sci, Ellensburg, WA 98926 USA
[2] Transilvania Univ Brasov, Dept Elect & Comp, Brasov, Romania
[3] Siemens SRL, Siemens Res & Predev, Brasov, Romania
Keywords
Transfer entropy; causality; deep learning; neural network explainability;
DOI
10.15837/ijccc.2025.1.6904
Chinese Library Classification (CLC): TP [Automation technology; computer technology]
Discipline classification code: 0812
Abstract
This paper explores the application of Transfer Entropy (TE) in deep neural networks as a tool to improve training efficiency and analyze causal information flow. TE is a measure of directed information transfer that captures nonlinear dependencies and temporal dynamics between system components. The study investigates the use of TE in optimizing learning in Convolutional Neural Networks and Graph Convolutional Neural Networks. We present case studies that demonstrate reduced training times and improved accuracy. In addition, we apply TE within the framework of Information Bottleneck theory, providing insight into the trade-off between compression and information preservation during the training of deep learning architectures. The results highlight TE's potential for identifying causal features, improving explainability, and addressing challenges such as oversmoothing in Graph Convolutional Neural Networks. Although computational overhead and complexity pose challenges, the findings emphasize the role of TE in creating more efficient and interpretable neural models.
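The abstract describes TE as a measure of directed information transfer between system components. As a rough illustration only (not the paper's implementation), the standard definition for two discrete time series with history length 1, TE(Y→X) = Σ p(x_{t+1}, x_t, y_t) · log₂ [ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ], can be sketched with a plain plug-in (frequency-count) estimator:

```python
from collections import Counter
from math import log2
import random

def transfer_entropy(src, dst):
    """Plug-in estimate of TE(src -> dst), history length 1, in bits,
    for two equal-length sequences over a discrete alphabet."""
    # One triple per time step: (x_{t+1}, x_t, y_t)
    triples = list(zip(dst[1:], dst[:-1], src[:-1]))
    n = len(triples)
    c_xyz = Counter(triples)                          # counts of (x_{t+1}, x_t, y_t)
    c_xy = Counter((x, y) for _, x, y in triples)     # counts of (x_t, y_t)
    c_x1x = Counter((x1, x) for x1, x, _ in triples)  # counts of (x_{t+1}, x_t)
    c_x = Counter(x for _, x, _ in triples)           # counts of x_t
    te = 0.0
    for (x1, x, y), c in c_xyz.items():
        p_joint = c / n                               # p(x_{t+1}, x_t, y_t)
        p_full = c / c_xy[(x, y)]                     # p(x_{t+1} | x_t, y_t)
        p_self = c_x1x[(x1, x)] / c_x[x]              # p(x_{t+1} | x_t)
        te += p_joint * log2(p_full / p_self)
    return te

# Sanity check: x copies y with a one-step lag, so y "drives" x.
random.seed(0)
y = [random.randint(0, 1) for _ in range(10000)]
x = [0] + y[:-1]
print(transfer_entropy(y, x))  # close to 1.0 bit (y fully determines the next x)
print(transfer_entropy(x, y))  # close to 0.0 (x tells us nothing about the next y)
```

Note the asymmetry in the sanity check: unlike mutual information, TE is directional, which is what the abstract leverages for analyzing causal information flow. In practice, estimating TE for continuous neural activations requires binning or more sophisticated estimators, which is part of the computational overhead the abstract mentions.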
Pages: 20
Related Papers
50 in total
  • [21] A Deep Learning Framework for Automated Transfer Learning of Neural Networks
    Balaiah, Thanasekhar
    Jeyadoss, Timothy Jones Thomas
    Thirumurugan, Sainee
    Ravi, Rahul Chander
    2019 11TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTING (ICOAC 2019), 2019, : 428 - 432
  • [22] Transfer Learning for Latin and Chinese Characters with Deep Neural Networks
    Ciresan, Dan C.
    Meier, Ueli
    Schmidhuber, Juergen
    2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [23] Japanese animation style transfer using deep neural networks
    Ye, Shiyang
    Ohtera, Ryo
PROCEEDINGS OF THE 2017 IEEE INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATION AND ENGINEERING (ICICE 2017), 2018, : 492 - 495
  • [24] Transfer learning for gene expression prediction with deep neural networks
    Arslan, Emre
    Rai, Kunal
    CANCER RESEARCH, 2020, 80 (16)
  • [25] Deep neural networks with transfer learning in millet crop images
    Coulibaly, Solemane
    Kamsu-Foguem, Bernard
    Kamissoko, Dantouma
    Traore, Daouda
    COMPUTERS IN INDUSTRY, 2019, 108 : 115 - 120
  • [26] Research on Task Discovery for Transfer Learning in Deep Neural Networks
    Akdemir, Arda
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020): STUDENT RESEARCH WORKSHOP, 2020, : 33 - 41
  • [27] Knowledge Transfer in Deep Block-Modular Neural Networks
    Terekhov, Alexander V.
    Montone, Guglielmo
    O'Regan, J. Kevin
    BIOMIMETIC AND BIOHYBRID SYSTEMS, LIVING MACHINES 2015, 2015, 9222 : 268 - 279
  • [28] ATTL: An Automated Targeted Transfer Learning with Deep Neural Networks
    Ahamed, Sayyed Farid
    Aggarwal, Priyanka
    Shetty, Sachin
    Lanus, Erin
    Freeman, Laura J.
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [29] Japanese Animation Style Transfer using Deep Neural Networks
    Ye, Shiyang
    Ohtera, Ryo
    PROCEEDINGS OF THE 2017 IEEE INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATION AND ENGINEERING (IEEE-ICICE 2017), 2017, : 492 - 495
  • [30] A Transfer Learning Evaluation of Deep Neural Networks for Image Classification
    Abou Baker, Nermeen
    Zengeler, Nico
    Handmann, Uwe
    MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2022, 4 (01): : 22 - 41