Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation

Cited: 0
Authors
Wang, Qixuan [1 ]
Zhang, Yanjun [2 ]
Lu, Jun [2 ]
Li, Congsheng [1 ]
Zhang, Yungang [2 ]
Affiliations
[1] China Acad Informat & Commun Technol, Beijing, Peoples R China
[2] Capital Med Univ, Beijing Chao Yang Hosp, Dept Pathol, Beijing 100020, Peoples R China
Source
PHYSICS IN MEDICINE AND BIOLOGY, 2024, Vol. 69, No. 18
Keywords
lung adenocarcinoma; histopathology; whole slide image; image classification; semi-supervised learning; multi-teacher knowledge distillation; ASSOCIATION; PATTERN; CANCER;
DOI
10.1088/1361-6560/ad7454
Chinese Library Classification
R318 [Biomedical Engineering]
Discipline code
0831
Abstract
Objective. In this study, we propose a semi-supervised learning (SSL) scheme using a patch-based deep learning (DL) framework to tackle the challenge of high-precision classification of seven lung tumor growth patterns despite having only a small amount of labeled data in whole slide images (WSIs). The scheme aims to improve generalization from limited data and to reduce dependence on large quantities of labeled data, a common bottleneck in medical image analysis. Approach. The study employs an SSL approach enhanced by a dynamic confidence threshold mechanism that adjusts according to the quantity and quality of the pseudo-labels generated. This dynamic thresholding avoids both the class imbalance among pseudo-labels and the scarcity of pseudo-labels that a high fixed threshold can cause. Furthermore, the research introduces a multi-teacher knowledge distillation (MTKD) technique that adaptively weights the predictions of multiple teacher models, transferring reliable knowledge while shielding the student model from low-quality teacher predictions. Main results. The framework was rigorously trained and evaluated on a dataset of 150 WSIs, each representing one of the seven growth patterns. The experimental results demonstrate that the framework classifies lung tumor growth patterns in histopathology images with high accuracy, performing comparably to fully supervised models and human pathologists. In addition, the framework's evaluation metrics on a publicly available dataset exceed those of previous studies, indicating good generalizability. Significance. This research demonstrates that an SSL approach can achieve results comparable to fully supervised models and expert pathologists, opening new possibilities for efficient and cost-effective medical image analysis.
The implementation of dynamic confidence thresholding and MTKD techniques represents a significant advancement in applying DL to complex medical image analysis tasks. This advancement could lead to faster and more accurate diagnoses, ultimately improving patient outcomes and fostering the overall progress of healthcare technology.
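The two mechanisms described in the abstract, dynamic confidence thresholding for pseudo-labels and adaptively weighted multi-teacher distillation targets, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the per-class threshold schedule and the entropy-based teacher weighting are plausible stand-ins, since the abstract does not give the exact formulas.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_class_thresholds(base_tau, pseudo_counts):
    """Per-class confidence thresholds for pseudo-labeling.

    Classes that have received few pseudo-labels so far get a lower
    threshold (down to base_tau / 2), counteracting class imbalance;
    the best-represented class keeps the full base threshold.
    """
    counts = np.asarray(pseudo_counts, dtype=float)
    scale = counts / max(counts.max(), 1.0)        # in [0, 1] per class
    return base_tau * (0.5 + 0.5 * scale)

def mtkd_target(teacher_logits_list, eps=1e-12):
    """Adaptively weighted soft target from multiple teachers.

    Each teacher is weighted by exp(-entropy) of its prediction, so
    confident (low-entropy) teachers dominate and low-quality teachers
    contribute little to the distillation target for the student.
    """
    probs = [softmax(np.asarray(l, dtype=float)) for l in teacher_logits_list]
    ent = np.array([-(p * np.log(p + eps)).sum() for p in probs])
    w = np.exp(-ent)
    w /= w.sum()
    return sum(wi * pi for wi, pi in zip(w, probs))
```

For example, `dynamic_class_thresholds(0.95, [10, 100, 50])` lowers the threshold most for the under-represented first class, and `mtkd_target` returns a probability vector that leans toward the most confident teacher.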
Pages: 16
Related papers (50 in total)
  • [1] Adversarial Multi-Teacher Distillation for Semi-Supervised Relation Extraction. Li, Wanli; Qian, Tieyun; Li, Xuhui; Zou, Lixin. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (08): 11291-11301.
  • [2] Ensemble Knowledge Distillation for Federated Semi-Supervised Image Classification. Shang, Ertong; Liu, Hui; Zhang, Jingyang; Zhao, Runqi; Du, Junzhao. TSINGHUA SCIENCE AND TECHNOLOGY, 2025, 30 (01): 112-123.
  • [3] Multi-teacher Self-training for Semi-supervised Node Classification with Noisy Labels. Liu, Yujing; Wu, Zongqian; Lu, Zhengyu; Wen, Guoqiu; Ma, Junbo; Lu, Guangquan; Zhu, Xiaofeng. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023: 2946-2954.
  • [4] Decoupled Multi-teacher Knowledge Distillation based on Entropy. Cheng, Xin; Tang, Jialiang; Zhang, Zhiqiang; Yu, Wenxin; Jiang, Ning; Zhou, Jinjia. 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024.
  • [5] Anomaly detection based on multi-teacher knowledge distillation. Ma, Ye; Jiang, Xu; Guan, Nan; Yi, Wang. JOURNAL OF SYSTEMS ARCHITECTURE, 2023, 138.
  • [6] Knowledge distillation-driven semi-supervised multi-view classification. Wang, Xiaoli; Wang, Yongli; Ke, Guanzhou; Wang, Yupeng; Hong, Xiaobin. INFORMATION FUSION, 2024, 103.
  • [7] Semi-Supervised Image Deraining Using Knowledge Distillation. Cui, Xin; Wang, Cong; Ren, Dongwei; Chen, Yunjin; Zhu, Pengfei. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (12): 8327-8341.
  • [8] MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution. Jiang, Yuxuan; Feng, Chen; Zhang, Fan; Bull, David. COMPUTER VISION - ECCV 2024, PT XXXIX, 2025, 15097: 364-382.
  • [9] Knowledge-based semi-supervised satellite image classification. Al Momani, Bilal; Morrow, Philip; McClean, Sally. 2007 9TH INTERNATIONAL SYMPOSIUM ON SIGNAL PROCESSING AND ITS APPLICATIONS, VOLS 1-3, 2007: 264-267.
  • [10] Certainty driven consistency loss on multi-teacher networks for semi-supervised learning. Liu, Lu; Tan, Robby T. PATTERN RECOGNITION, 2021, 120.