Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision

Cited by: 6
Authors:
Javed, Sajid [1 ]
Mahmood, Arif [2 ]
Qaiser, Talha [3 ]
Werghi, Naoufel [1 ]
Affiliations:
[1] Khalifa Univ Sci & Technol, Dept Elect Engn & Comp Sci, Abu Dhabi, U Arab Emirates
[2] Informat Technol Univ, Dept Comp Sci, Lahore 54000, Pakistan
[3] Univ Warwick, Dept Comp Sci, Coventry CV4 7AL, England
Keywords:
Cancer; Training; Knowledge engineering; Histopathology; Task analysis; Predictive models; Neural networks; Knowledge distillation; features distillation; histology image classification; tissue phenotyping; FRAMEWORK; NETWORK;
DOI: 10.1109/JBHI.2023.3237749
Chinese Library Classification (CLC): TP [automation technology; computer technology]
Discipline code: 0812
Abstract:
Automatic tissue classification is a fundamental task in computational pathology for profiling tumor micro-environments. Deep learning has advanced tissue classification performance at the cost of significant computational power. Shallow networks have also been trained end-to-end with direct supervision; however, their performance degrades because they fail to capture robust tissue heterogeneity. Knowledge distillation has recently been employed to improve the performance of shallow networks, used as student networks, by adding supervision from deep neural networks, used as teacher networks. In the current work, we propose a novel knowledge distillation algorithm to improve the performance of shallow networks for tissue phenotyping in histology images. For this purpose, we propose multi-layer feature distillation such that a single layer in the student network receives supervision from multiple teacher layers. In the proposed algorithm, the feature maps of the two layers are matched in size by a learnable multi-layer perceptron, and the distance between the feature maps is then minimized during training of the student network. The overall objective function is computed by summing the loss over multiple layer combinations, each weighted by a learnable attention-based parameter. The proposed algorithm is named Knowledge Distillation for Tissue Phenotyping (KDTP). Experiments are performed on five publicly available histology image classification datasets using several teacher-student network combinations within the KDTP algorithm. Our results demonstrate a significant performance increase in the student networks when trained with the proposed KDTP algorithm compared to direct supervision-based training methods.
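The abstract describes the KDTP objective at a high level: a single student layer is supervised by multiple teacher layers, a learnable MLP matches the feature-map sizes of each layer pair, and the per-pair distances are combined with learnable attention weights. The following is a minimal PyTorch sketch of such a loss, assuming globally pooled convolutional feature maps and an L2 distance; the names FeatureProjector, KDTPLoss, and attn_logits are hypothetical, and the paper's exact formulation may differ. It is not the authors' released code.

# Minimal sketch of a multi-layer feature-distillation loss as described in the
# abstract (illustrative assumption, not the authors' implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureProjector(nn.Module):
    # Learnable multi-layer perceptron that maps a pooled student feature vector
    # onto the channel dimension of a teacher feature map so the two can be compared.
    def __init__(self, student_dim: int, teacher_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(student_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, teacher_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(x)


class KDTPLoss(nn.Module):
    # One student layer supervised by several teacher layers: each student-teacher
    # pair has its own projector, and the per-pair distances are combined with
    # learnable attention weights (softmax over one logit per pair).
    def __init__(self, student_dim: int, teacher_dims: list):
        super().__init__()
        self.projectors = nn.ModuleList(
            [FeatureProjector(student_dim, d) for d in teacher_dims]
        )
        self.attn_logits = nn.Parameter(torch.zeros(len(teacher_dims)))

    def forward(self, student_feat: torch.Tensor, teacher_feats: list) -> torch.Tensor:
        # Global-average-pool spatial maps (B, C, H, W) to vectors (B, C) so that
        # feature sizes can be matched by the MLP projectors.
        s = student_feat.flatten(2).mean(dim=2)
        losses = []
        for proj, t_feat in zip(self.projectors, teacher_feats):
            t = t_feat.flatten(2).mean(dim=2).detach()  # teacher is frozen
            losses.append(F.mse_loss(proj(s), t))
        weights = torch.softmax(self.attn_logits, dim=0)
        return (weights * torch.stack(losses)).sum()


if __name__ == "__main__":
    # Toy shapes only; in practice this distillation term would be added to the
    # student's usual cross-entropy classification loss, with the projectors and
    # attention logits optimized jointly with the student network.
    student_feat = torch.randn(4, 128, 14, 14)
    teacher_feats = [torch.randn(4, 512, 14, 14), torch.randn(4, 1024, 7, 7)]
    kd = KDTPLoss(student_dim=128, teacher_dims=[512, 1024])
    print(kd(student_feat, teacher_feats).item())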
Pages: 2037 - 2046
Number of pages: 10
Related papers (showing entries [41]-[50] of 50):
  • [41] Assembling engineering knowledge in a modular multi-layer perceptron neural network
    Jansen, WJ
    Diepenhorst, M
    Nijhuis, JAG
    Spaanenburg, L
    1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 232 - 237
  • [42] Knowledge-based control of reactive systems with multi-layer architecture
    Matyasik, P.
    Nalepa, G. J.
    MIXDES 2007: Proceedings of the 14th International Conference on Mixed Design of Integrated Circuits and Systems, 2007, : 667 - 672
  • [43] Preparing lessons: Improve knowledge distillation with better supervision
    Wen, Tiancheng
    Lai, Shenqi
    Qian, Xueming
    NEUROCOMPUTING, 2021, 454 : 25 - 33
  • [44] Multi-layer parallel shooting method for multi-layer boundary value problems
    Allan, Fathi M.
    Hajji, Mohamed Ali
    2009 INTERNATIONAL CONFERENCE ON INNOVATIONS IN INFORMATION TECHNOLOGY, 2009, : 315 - 319
  • [45] Achieving maximum recovery of latent heat in photothermally driven multi-layer stacked membrane distillation
    Ghim, Deoukchen
    Wu, Xuanhao
    Suazo, Mathew
    Jun, Young-Shin
    NANO ENERGY, 2021, 80
  • [46] Multi-layer ear-scalp distillation framework for ear-EEG classification enhancement
    Sun, Ying
    Zhang, Feiyang
    Li, Ziyu
    Liu, Xiaolin
    Zheng, Dezhi
    Zhang, Shuailei
    Fan, Shangchun
    Wu, Xia
    JOURNAL OF NEURAL ENGINEERING, 2024, 21 (06)
  • [47] Performance prediction of vacuum membrane distillation system based on multi-layer perceptron neural network
    Si, Zetian
    Li, Zhuohao
    Li, Ke
    Li, Zhiwei
    Wang, Gang
    DESALINATION, 2025, 602
  • [48] Price Prediction of Cryptocurrency Using a Multi-Layer Gated Recurrent Unit Network with Multi Features
    Patra, Gyana Ranjan
    Mohanty, Mihir Narayan
    COMPUTATIONAL ECONOMICS, 2023, 62 : 1525 - 1544
  • [49] Price Prediction of Cryptocurrency Using a Multi-Layer Gated Recurrent Unit Network with Multi Features
    Patra, Gyana Ranjan
    Mohanty, Mihir Narayan
    COMPUTATIONAL ECONOMICS, 2023, 62 (04) : 1525 - 1544
  • [50] Multi-Attributed Graph Matching With Multi-Layer Graph Structure and Multi-Layer Random Walks
    Park, Han-Mu
    Yoon, Kuk-Jin
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (05) : 2314 - 2325