Uncertainty Driven Adaptive Self-Knowledge Distillation for Medical Image Segmentation

Cited by: 0
Authors
Guo, Xutao [1 ,2 ]
Wang, Mengqi [3 ]
Xiang, Yang [2 ]
Yang, Yanwu [1 ,2 ]
Ye, Chenfei [4 ,5 ]
Wang, Haijun [6 ,7 ]
Ma, Ting [4 ,8 ]
Affiliations
[1] Harbin Inst Technol Shenzhen, Sch Elect & Informat Engn, Shenzhen 518055, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Peoples R China
[3] Shenzhen Univ, Shenzhen Peoples Hosp 2, Affiliated Hosp 1, Dept Neurooncol, Shenzhen 518037, Peoples R China
[4] Harbin Inst Technol Shenzhen, Sch Biomed Engn & Digital Hlth, Shenzhen 518055, Peoples R China
[5] Harbin Inst Technol, Int Res Inst Artificial Intelligence, Shenzhen 518055, Peoples R China
[6] Sun Yat Sen Univ, Affiliated Hosp 6, Dept Neurosurg, Guangzhou 510655, Peoples R China
[7] First Affiliated Hosp Sun Yat Sen, Dept Neurosurg, Guangzhou 510060, Peoples R China
[8] Harbin Inst Technol, Guangdong Prov Key Lab Aerosp Commun & Networking, Shenzhen 518055, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Biomedical imaging; Training; Predictive models; Image segmentation; Adaptation models; Knowledge engineering; Estimation; Semantics; Computational modeling; Medical image segmentation; overfitting; knowledge distillation; uncertainty; cyclic ensembles; neural networks
DOI
10.1109/TETCI.2025.3526259
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep learning has recently improved the precision of medical image segmentation significantly. However, because medical image segmentation datasets are typically small and training relies on hard labels (one-hot vectors), deep learning models often overfit, which degrades segmentation performance. To mitigate this problem, we propose an uncertainty driven adaptive self-knowledge distillation (UAKD) model for medical image segmentation that regularizes training through self-generated soft labels. The innovation of UAKD is to integrate uncertainty estimation into both soft label generation and student network training, ensuring accurate supervision and effective regularization. Specifically, UAKD introduces teacher network ensembling to reduce the semantic bias in soft labels caused by the fitting biases of individual teacher networks. We also propose an adaptive knowledge distillation mechanism that uses uncertainty to assign adaptive weights to soft labels in the loss function, efficiently transferring reliable knowledge from the teacher networks to the student network while suppressing unreliable information. Finally, we introduce a gradient ascent based cyclic ensemble method that reduces teacher network overfitting on the training data, further strengthening teacher ensembling and uncertainty estimation. Experiments on three medical image segmentation tasks show that UAKD outperforms existing regularization methods and demonstrate the effectiveness of uncertainty estimation for assessing soft label reliability.
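To make the mechanism concrete, the following is a minimal PyTorch-style sketch of an uncertainty-weighted distillation loss of the kind the abstract describes. It is an illustration under stated assumptions, not the authors' released implementation: the function name, the temperature T, the mixing weight alpha, and the use of normalized predictive entropy as the uncertainty measure are all hypothetical choices.

```python
import math
import torch
import torch.nn.functional as F

def uakd_style_loss(student_logits, teacher_logits_list, target, T=2.0, alpha=0.5):
    """Sketch of an uncertainty-weighted self-distillation loss (assumed form).

    student_logits:      (B, C, H, W) raw outputs of the student network
    teacher_logits_list: list of (B, C, H, W) logits, one per teacher snapshot
    target:              (B, H, W) integer ground-truth labels
    """
    # Soft labels: mean of temperature-scaled teacher probabilities (ensembling).
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=1) for t in teacher_logits_list]
    )                                        # (K, B, C, H, W)
    soft_labels = teacher_probs.mean(dim=0)  # (B, C, H, W)

    # Per-pixel uncertainty: predictive entropy of the ensemble mean,
    # normalized to [0, 1] by log(num_classes).
    num_classes = soft_labels.shape[1]
    log_soft = soft_labels.clamp_min(1e-8).log()
    entropy = -(soft_labels * log_soft).sum(dim=1) / math.log(num_classes)  # (B, H, W)

    # Adaptive weights: trust soft labels less where the ensemble is uncertain.
    weights = 1.0 - entropy

    # Distillation term: per-pixel KL(soft_labels || student), weighted
    # by reliability and scaled by T^2 as is standard in distillation.
    log_student = F.log_softmax(student_logits / T, dim=1)
    kl = (soft_labels * (log_soft - log_student)).sum(dim=1)  # (B, H, W)
    distill = (weights * kl).mean() * (T * T)

    # Supervised term on the hard labels.
    ce = F.cross_entropy(student_logits, target)

    return alpha * distill + (1.0 - alpha) * ce
```

The design intent mirrors the abstract: pixels where the teacher ensemble disagrees (high entropy) contribute less to the distillation term, so unreliable soft labels are suppressed while reliable ensemble knowledge is transferred.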
Pages: 14
Related papers
50 records in total
  • [1] Adversarial class-wise self-knowledge distillation for medical image segmentation
    Yu, Xiangchun; Shen, Jiaqing; Zhang, Dingwen; Zheng, Jian
    SCIENTIFIC REPORTS, 15 (1)
  • [2] Adaptive lightweight network construction method for Self-Knowledge Distillation
    Lu, Siyuan; Zeng, Weiliang; Li, Xueshi; Ou, Jiajun
    NEUROCOMPUTING, 2025, 624
  • [3] MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition
    Yang, Chuanguang; An, Zhulin; Zhou, Helong; Cai, Linhang; Zhi, Xiang; Wu, Jiwen; Xu, Yongjun; Zhang, Qian
    COMPUTER VISION, ECCV 2022, PT XXIV, 2022, 13684 : 534 - 551
  • [4] Neighbor self-knowledge distillation
    Liang, Peng; Zhang, Weiwei; Wang, Junhuang; Guo, Yufeng
    INFORMATION SCIENCES, 2024, 654
  • [5] Efficient Medical Image Segmentation Based on Knowledge Distillation
    Qin, Dian; Bu, Jia-Jun; Liu, Zhe; Shen, Xin; Zhou, Sheng; Gu, Jing-Jun; Wang, Zhi-Hua; Wu, Lei; Dai, Hui-Fen
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2021, 40 (12) : 3820 - 3831
  • [6] Self-knowledge distillation with dimensional history knowledge
    Huang, Wenke; Ye, Mang; Shi, Zekun; Li, He; Du, Bo
    SCIENCE CHINA INFORMATION SCIENCES, 2025, 68 (9)
  • [7] Self-knowledge, uncertainty, and choice
    Schick, F.
    BRITISH JOURNAL FOR THE PHILOSOPHY OF SCIENCE, 1979, 30 (03): : 235 - 252
  • [8] A Lightweight Convolution Network with Self-Knowledge Distillation for Hyperspectral Image Classification
    Xu, Hao; Cao, Guo; Deng, Lindiao; Ding, Lanwei; Xu, Ling; Pan, Qikun; Shang, Yanfeng
    FOURTEENTH INTERNATIONAL CONFERENCE ON GRAPHICS AND IMAGE PROCESSING, ICGIP 2022, 2022, 12705
  • [9] Self-knowledge distillation via dropout
    Lee, Hyoje; Park, Yeachan; Seo, Hyun; Kang, Myungjoo
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 233
  • [10] Prototype-wise self-knowledge distillation for few-shot segmentation
    Chen, Yadang; Xu, Xinyu; Wei, Chenchen; Lu, Chuhan
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2024, 129