Semi-Supervised Neuron Segmentation via Reinforced Consistency Learning

Cited by: 35
|
Authors
Huang, Wei [1 ,2 ]
Chen, Chang [1 ,2 ]
Xiong, Zhiwei [1 ,2 ]
Zhang, Yueyi [1 ,2 ]
Chen, Xuejin [1 ,2 ]
Sun, Xiaoyan [1 ,2 ]
Wu, Feng [1 ,2 ]
Affiliations
[1] Univ Sci & Technol China, Dept Elect Engn & Informat Sci, Hefei 230027, Peoples R China
[2] Hefei Comprehens Natl Sci Ctr, Inst Artificial Intelligence, Hefei 230088, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Neurons; Task analysis; Image segmentation; Data models; Perturbation methods; Training; Information filters; Neuron segmentation; deep learning; semi-supervised learning; electron microscopy images; ELECTRON-MICROSCOPY; RECONSTRUCTION;
DOI
10.1109/TMI.2022.3176050
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Subject Classification Codes
081203 ; 0835 ;
Abstract
Emerging deep learning-based methods have enabled great progress in automatic neuron segmentation from Electron Microscopy (EM) volumes. However, the success of existing methods relies heavily on large numbers of annotations, which are often expensive and time-consuming to collect due to the dense distribution and complex structure of neurons. When the required quantity of manual annotations cannot be obtained, these methods become fragile. To address this issue, we propose a two-stage, semi-supervised learning method for neuron segmentation that fully extracts useful information from unlabeled data. First, we devise a proxy task that enables network pre-training by reconstructing original volumes from their perturbed counterparts. This pre-training strategy implicitly extracts meaningful information on neuron structures from unlabeled data to facilitate the next stage of learning. Second, we regularize the supervised learning process with pixel-level prediction consistency between unlabeled samples and their perturbed counterparts. This improves the generalizability of the learned model to diverse data distributions in EM volumes, especially when the number of labels is limited. Extensive experiments on representative EM datasets demonstrate the superior performance of our reinforced consistency learning over supervised learning, i.e., up to a 400% gain on the VOI metric with only a few available labels, on par with a model trained on ten times the amount of labeled data in a supervised manner. Code is available at https://github.com/weih527/SSNS-Net.
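The two-stage recipe in the abstract can be sketched in a few lines. The following is a hedged NumPy illustration of the general idea, not the authors' implementation from the linked repository: `perturb` (voxel masking), the MSE losses, and the weight `lam` are all hypothetical stand-ins for whatever perturbations and objectives the paper actually uses.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb(volume, mask_ratio=0.3):
    """One simple perturbation: zero out a random fraction of voxels.
    (The paper's perturbation scheme may differ; this is illustrative.)"""
    mask = rng.random(volume.shape) < mask_ratio
    out = volume.copy()
    out[mask] = 0.0
    return out

def reconstruction_loss(reconstructed, original):
    """Stage 1 proxy task: score reconstruction of the original volume
    from its perturbed counterpart (here with pixel-wise MSE)."""
    return float(np.mean((reconstructed - original) ** 2))

def consistency_loss(pred_unlabeled, pred_perturbed):
    """Stage 2 regularizer: penalize pixel-level disagreement between
    predictions on an unlabeled sample and its perturbed counterpart."""
    return float(np.mean((pred_unlabeled - pred_perturbed) ** 2))

def semi_supervised_loss(sup_loss, pred_u, pred_u_perturbed, lam=1.0):
    """Total stage-2 objective: supervised term on labeled data plus a
    weighted consistency term on unlabeled data (lam is hypothetical)."""
    return sup_loss + lam * consistency_loss(pred_u, pred_u_perturbed)

# Toy usage on a random 3D "EM volume": identical inputs give zero loss.
vol = rng.random((4, 8, 8))
assert reconstruction_loss(vol, vol) == 0.0
assert consistency_loss(vol, vol) == 0.0
```

In this reading, stage 1 trains the network with `reconstruction_loss` on unlabeled volumes only, after which stage 2 fine-tunes it with `semi_supervised_loss`, so the consistency term lets every unlabeled sample contribute a training signal.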
Pages: 3016-3028
Page count: 13
Related Papers
50 records
  • [21] Li, Xiaoqiang; He, Qin; Dai, Songmin; Wu, Pin; Tong, Weiqin: Semi-Supervised Semantic Segmentation Constrained by Consistency Regularization. 2020 IEEE International Conference on Multimedia and Expo (ICME), 2020.
  • [22] Chen, Faquan; Fei, Jingjing; Chen, Yaqi; Huang, Chenxi: Decoupled Consistency for Semi-supervised Medical Image Segmentation. Medical Image Computing and Computer Assisted Intervention, MICCAI 2023, Pt I, 2023, 14220: 551-561.
  • [23] Xu, Kun; Su, Hang; Zhu, Jun; Guan, Ji-Song; Zhang, Bo: Neuron Segmentation Based on CNN with Semi-supervised Regularization. Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2016), 2016: 1324-1332.
  • [24] Lin, Huibin; Wang, Shiping; Liu, Zhanghui; Xiao, Shunxin; Du, Shide; Guo, Wenzhong: FMixAugment for Semi-supervised Learning with Consistency Regularization. Pattern Recognition and Computer Vision, PRCV 2021, Pt II, 2021, 13020: 127-139.
  • [25] Fan, Yue; Kukleva, Anna; Dai, Dengxin; Schiele, Bernt: Revisiting Consistency Regularization for Semi-Supervised Learning. International Journal of Computer Vision, 2023, 131(03): 626-643.
  • [26] Cheng, Ziyu; Wang, Xianmin; Li, Jing: ProMatch: Semi-Supervised Learning with Prototype Consistency. Mathematics, 2023, 11(16).
  • [27] Verma, Vikas; Kawaguchi, Kenji; Lamb, Alex; Kannala, Juho; Solin, Arno; Bengio, Yoshua; Lopez-Paz, David: Interpolation Consistency Training for Semi-supervised Learning. Neural Networks, 2022, 145: 90-106.
  • [28] Verma, Vikas; Lamb, Alex; Kannala, Juho; Bengio, Yoshua; Lopez-Paz, David: Interpolation Consistency Training for Semi-Supervised Learning. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), 2019: 3635-3641.
  • [29] Gui, Jie; Hu, Rongxiang; Zhao, Zhongqiu; Jia, Wei: Semi-supervised Learning with Local and Global Consistency. International Journal of Computer Mathematics, 2014, 91(11): 2389-2402.