Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation

Citations: 0
Authors:
Wang, Qixuan [1 ]
Zhang, Yanjun [2 ]
Lu, Jun [2 ]
Li, Congsheng [1 ]
Zhang, Yungang [2 ]
Affiliations:
[1] China Acad Informat & Commun Technol, Beijing, Peoples R China
[2] Capital Med Univ, Beijing Chao Yang Hosp, Dept Pathol, Beijing 100020, Peoples R China
Source:
PHYSICS IN MEDICINE AND BIOLOGY | 2024, Vol. 69, No. 18
Keywords:
lung adenocarcinoma; histopathology; whole slide image; image classification; semi-supervised learning; multi-teacher knowledge distillation; ASSOCIATION; PATTERN; CANCER;
DOI:
10.1088/1361-6560/ad7454
CLC Number:
R318 [Biomedical Engineering];
Discipline Code:
0831 ;
Abstract:
Objective. In this study, we propose a semi-supervised learning (SSL) scheme using a patch-based deep learning (DL) framework to tackle the challenge of high-precision classification of seven lung tumor growth patterns, despite having only a small amount of labeled data in whole slide images (WSIs). This scheme aims to enhance generalization ability with limited data and reduce dependence on large amounts of labeled data, effectively addressing the high demand for labeled data that is common in medical image analysis. Approach. To address these challenges, the study employs an SSL approach enhanced by a dynamic confidence threshold mechanism, which adjusts based on the quantity and quality of the pseudo-labels generated. This dynamic thresholding helps avoid both the class imbalance among pseudo-labels and the low pseudo-label yield that may result from a higher fixed threshold. Furthermore, the research introduces a multi-teacher knowledge distillation (MTKD) technique, which adaptively weights the predictions of multiple teacher models to transfer reliable knowledge and safeguard the student model from low-quality teacher predictions. Main results. The framework underwent rigorous training and evaluation on a dataset of 150 WSIs, each representing one of the seven growth patterns. The experimental results demonstrate that the framework classifies lung tumor growth patterns in histopathology images with high accuracy. Notably, its performance is comparable to that of fully supervised models and human pathologists. In addition, the framework's evaluation metrics on a publicly available dataset exceed those of previous studies, indicating good generalizability. Significance. This research demonstrates that an SSL approach can achieve results comparable to fully supervised models and expert pathologists, opening new possibilities for efficient and cost-effective medical image analysis.
The implementation of dynamic confidence thresholding and MTKD techniques represents a significant advancement in applying DL to complex medical image analysis tasks. This advancement could lead to faster and more accurate diagnoses, ultimately improving patient outcomes and fostering the overall progress of healthcare technology.
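The two mechanisms named in the abstract can be illustrated with a minimal sketch. This is an assumption-laden toy version, not the authors' implementation: the function names, the class-frequency scaling of the threshold, and the use of negative prediction entropy as the teacher-weighting signal are all hypothetical choices made here for illustration.

```python
# Illustrative sketch only: hypothetical names and formulas, not the
# paper's actual method. Assumes per-class dynamic thresholds scaled by
# pseudo-label counts, and teacher weights derived from prediction entropy.
import numpy as np

def dynamic_thresholds(class_counts, base_threshold=0.95):
    """Lower the confidence threshold for classes with few accepted
    pseudo-labels, so pseudo-label categories stay balanced; the most
    populated class keeps the strict base threshold."""
    counts = np.asarray(class_counts, dtype=float)
    ratio = counts / max(counts.max(), 1.0)          # in [0, 1] per class
    return base_threshold * (ratio / (2.0 - ratio))  # concave scaling (hypothetical)

def select_pseudo_labels(probs, thresholds):
    """Keep unlabeled samples whose top-class probability exceeds that
    class's dynamic threshold; return (sample indices, hard labels)."""
    preds = probs.argmax(axis=1)
    conf = probs.max(axis=1)
    mask = conf >= thresholds[preds]
    return np.nonzero(mask)[0], preds[mask]

def mtkd_target(teacher_probs):
    """Adaptively weight each teacher by its confidence (negative
    entropy), so low-quality teachers contribute less to the
    distillation target handed to the student."""
    probs = np.asarray(teacher_probs)                # (n_teachers, n_classes)
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=1)
    weights = np.exp(-entropy)
    weights = weights / weights.sum()
    return (weights[:, None] * probs).sum(axis=0)    # soft target distribution
```

As a usage sketch: a majority class with 100 accepted pseudo-labels keeps the 0.95 threshold, while a minority class with 10 drops to a much lower bar, and a confident teacher pulls the distillation target toward its prediction more strongly than an uncertain one.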
Pages: 16
Related Papers
50 records in total
  • [11] Confidence-Guided Online Knowledge Distillation for Semi-supervised Medical Image Classification
    Qu, Aixi
    Wu, Qiang
    Yu, Luyue
    Liu, Ju
    ADVANCES IN SWARM INTELLIGENCE, PT II, ICSI 2024, 2024, 14789 : 245 - 257
  • [12] MTKDSR: Multi-Teacher Knowledge Distillation for Super Resolution Image Reconstruction
    Yao, Gengqi
    Li, Zhan
    Bhanu, Bir
    Kang, Zhiqing
    Zhong, Ziyi
    Zhang, Qingfeng
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 352 - 358
  • [13] Correlation Guided Multi-teacher Knowledge Distillation
    Shi, Luyao
    Jiang, Ning
    Tang, Jialiang
    Huang, Xinlei
    NEURAL INFORMATION PROCESSING, ICONIP 2023, PT IV, 2024, 14450 : 562 - 574
  • [14] Reinforced Multi-Teacher Selection for Knowledge Distillation
    Yuan, Fei
    Shou, Linjun
    Pei, Jian
    Lin, Wutao
    Gong, Ming
    Fu, Yan
    Jiang, Daxin
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14284 - 14291
  • [15] Knowledge Distillation via Multi-Teacher Feature Ensemble
    Ye, Xin
    Jiang, Rongxin
    Tian, Xiang
    Zhang, Rui
    Chen, Yaowu
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 566 - 570
  • [16] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature
    Fei, Hongxiao
    Tan, Yangying
    Huang, Wenti
    Long, Jun
    Huang, Jincai
    Yang, Liu
    WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332 : 103 - 116
  • [17] Multi-Teacher D-S Fusion for Semi-Supervised SAR Ship Detection
    Zhang, Xinzheng
    Li, Jinlin
    Li, Chao
    Liu, Guojin
    REMOTE SENSING, 2024, 16 (15)
  • [18] A Multi-Teacher Assisted Knowledge Distillation Approach for Enhanced Face Image Authentication
    Cheng, Tiancong
    Zhang, Ying
    Yin, Yifang
    Zimmermann, Roger
    Yu, Zhiwen
    Guo, Bin
    PROCEEDINGS OF THE 2023 ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2023, 2023, : 135 - 143
  • [19] Semi-supervised Deep Linear Discriminant Analysis for Histopathology Image Classification
    Cui, Lei
    Feng, Jun
    Yang, Lin
    PROCEEDINGS 2018 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2018, : 2333 - 2337
  • [20] Multi-resolution consistency semi-supervised active learning framework for histopathology image classification
    Xie, Mingjian
    Geng, Yiqun
    Zhang, Weifeng
    Li, Shan
    Dong, Yuejiao
    Wu, Yongjun
    Tang, Hongzhong
    Hong, Liangli
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 259