Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation

Cited: 0
Authors
Wang, Qixuan [1 ]
Zhang, Yanjun [2 ]
Lu, Jun [2 ]
Li, Congsheng [1 ]
Zhang, Yungang [2 ]
Affiliations
[1] China Acad Informat & Commun Technol, Beijing, Peoples R China
[2] Capital Med Univ, Beijing Chao Yang Hosp, Dept Pathol, Beijing 100020, Peoples R China
Source
PHYSICS IN MEDICINE AND BIOLOGY | 2024, Vol. 69, No. 18
Keywords
lung adenocarcinoma; histopathology; whole slide image; image classification; semi-supervised learning; multi-teacher knowledge distillation; ASSOCIATION; PATTERN; CANCER;
DOI
10.1088/1361-6560/ad7454
CLC Number
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Objective. In this study, we propose a semi-supervised learning (SSL) scheme using a patch-based deep learning (DL) framework to tackle the challenge of high-precision classification of seven lung tumor growth patterns, despite having only a small amount of labeled data in whole slide images (WSIs). The scheme aims to enhance generalization ability under limited data and to reduce dependence on large amounts of labeled data, thereby addressing the common shortage of annotations in medical image analysis.

Approach. The study employs an SSL approach enhanced by a dynamic confidence threshold mechanism that adjusts according to the quantity and quality of the pseudo-labels generated. This dynamic thresholding avoids both the class imbalance among pseudo-labels and the scarcity of pseudo-labels that a high fixed threshold can cause. The research further introduces a multi-teacher knowledge distillation (MTKD) technique, which adaptively weights the predictions of multiple teacher models to transfer reliable knowledge and to shield the student model from low-quality teacher predictions.

Main results. The framework underwent rigorous training and evaluation on a dataset of 150 WSIs, each representing one of the seven growth patterns. The experimental results demonstrate that the framework classifies lung tumor growth patterns in histopathology images with high accuracy; notably, its performance is comparable to that of fully supervised models and human pathologists. In addition, the framework's evaluation metrics on a publicly available dataset exceed those of previous studies, indicating good generalizability.

Significance. This research demonstrates that an SSL approach can achieve results comparable to fully supervised models and expert pathologists, opening new possibilities for efficient and cost-effective medical image analysis. The dynamic confidence thresholding and MTKD techniques represent a significant advance in applying DL to complex medical image analysis tasks, potentially enabling faster and more accurate diagnoses, improving patient outcomes, and fostering the overall progress of healthcare technology.
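This record gives no implementation details for the two mechanisms named in the abstract. As a rough illustration only, the sketch below shows one plausible reading: a FlexMatch-style per-class dynamic threshold for pseudo-label selection and negative-entropy weighting of teacher predictions for the MTKD loss. Every function name, the mapping x/(2-x), the 0.5 floor, the temperature T, and the confidence measure are hypothetical choices for illustration, not the authors' published method.

```python
# Illustrative sketch only -- not the authors' released code. Assumes a
# FlexMatch-style per-class dynamic threshold and negative-entropy teacher
# weighting; every name and constant below is a hypothetical choice.
import torch
import torch.nn.functional as F

NUM_CLASSES = 7  # the seven lung tumor growth patterns


def dynamic_thresholds(unlabeled_probs, base_tau=0.95):
    """Lower the confidence bar for classes that currently receive few
    confident predictions, so pseudo-label categories stay balanced."""
    conf, pred = unlabeled_probs.max(dim=1)
    # per-class count of confident predictions (device-safe via bincount)
    counts = torch.bincount(pred[conf >= base_tau],
                            minlength=NUM_CLASSES).float()
    learning_status = counts / counts.max().clamp(min=1.0)
    # convex mapping x / (2 - x), as in FlexMatch; the floor of 0.5 stands
    # in for the warm-up a real implementation would use
    tau = base_tau * learning_status / (2.0 - learning_status)
    return tau.clamp(min=0.5)


def select_pseudo_labels(unlabeled_probs, base_tau=0.95):
    """Keep unlabeled patches whose confidence clears the dynamic,
    class-specific threshold; return their indices and hard labels."""
    conf, pred = unlabeled_probs.max(dim=1)
    tau = dynamic_thresholds(unlabeled_probs, base_tau)
    mask = conf >= tau[pred]
    return mask.nonzero(as_tuple=True)[0], pred[mask]


def mtkd_loss(student_logits, teacher_logits_list, T=4.0):
    """Weight each teacher per sample by its confidence (negative entropy),
    fuse the soft targets, and distill into the student via KL divergence."""
    weights, targets = [], []
    for t_logits in teacher_logits_list:
        p = F.softmax(t_logits / T, dim=1)
        entropy = -(p * p.clamp(min=1e-8).log()).sum(dim=1, keepdim=True)
        weights.append(-entropy)  # low entropy -> high weight
        targets.append(p)
    w = F.softmax(torch.cat(weights, dim=1), dim=1)   # [batch, n_teachers]
    fused = sum(w[:, i:i + 1] * targets[i] for i in range(len(targets)))
    log_q = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_q, fused, reduction="batchmean") * (T * T)


# Toy usage: 32 unlabeled patches, 3 teacher models.
probs = F.softmax(torch.randn(32, NUM_CLASSES), dim=1)
idx, hard_labels = select_pseudo_labels(probs)
teachers = [torch.randn(32, NUM_CLASSES) for _ in range(3)]
loss = mtkd_loss(torch.randn(32, NUM_CLASSES), teachers)
```

Weighting teachers per sample by negative entropy is one simple way to down-weight low-confidence teacher predictions, matching the abstract's stated goal of safeguarding the student from low-quality teachers; the paper itself may use a different weighting scheme.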
Pages: 16
Related Papers
50 records in total
  • [31] Semi-supervised medical image classification based on CamMix
    Guo, Lingchao
    Wang, Changjian
    Zhang, Dongsong
    Xu, Kele
    Huang, Zhen
    Luo, Li
    Peng, Yuxing
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [32] Multi-teacher knowledge distillation based on Joint Guidance of Probe and Adaptive Corrector
    Shang, Ronghua
    Li, Wenzheng
    Zhu, Songling
    Jiao, Licheng
    Li, Yangyang
    NEURAL NETWORKS, 2023, 164 : 345 - 356
  • [33] Device adaptation free-KDA based on multi-teacher knowledge distillation
    Yang, Yafang
    Guo, Bin
    Liang, Yunji
    Zhao, Kaixing
    Yu, Zhiwen
    JOURNAL OF AMBIENT INTELLIGENCE AND HUMANIZED COMPUTING, 2024, 15 (10) : 3603 - 3615
  • [34] Multi-view semi-supervised learning for image classification
    Zhu, Songhao
    Sun, Xian
    Jin, Dongliang
    NEUROCOMPUTING, 2016, 208 : 136 - 142
  • [35] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning
    Zhang, Hailin
    Chen, Defang
    Wang, Can
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1943 - 1948
  • [36] Multi-teacher knowledge distillation for debiasing recommendation with uniform data
    Yang, Xinxin
    Li, Xinwei
    Liu, Zhen
    Yuan, Yafan
    Wang, Yannan
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 273
  • [37] ATMKD: adaptive temperature guided multi-teacher knowledge distillation
    Lin, Yu-e
    Yin, Shuting
    Ding, Yifeng
    Liang, Xingzhu
    MULTIMEDIA SYSTEMS, 2024, 30 (05)
  • [38] Reinforced Multi-teacher Knowledge Distillation for Unsupervised Sentence Representation
    Wang, Xintao
    Jin, Rize
    Qi, Shibo
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 320 - 332
  • [39] Semi-supervised Campus Network Intrusion Detection Based on Knowledge Distillation
    Chen, Junjun
    Guo, Qiang
    Fu, Zhongnan
    Shang, Qun
    Ma, Hao
    Wang, Nai
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [40] Adaptive multi-teacher softened relational knowledge distillation framework for payload mismatch in image steganalysis
    Yu, Lifang
    Li, Yunwei
    Weng, Shaowei
    Tian, Huawei
    Liu, Jing
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2023, 95