Semi-Supervised Deep Regression with Uncertainty Consistency and Variational Model Ensembling via Bayesian Neural Networks

Cited by: 0
Authors
Dai, Weihang [1 ]
Li, Xiaomeng [1 ,2 ]
Cheng, Kwang-Ting [1 ,2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[2] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Keywords
DOI: not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep regression is an important problem with numerous applications. These range from computer vision tasks such as age estimation from photographs, to medical tasks such as ejection fraction estimation from echocardiograms for disease tracking. However, semi-supervised approaches for deep regression are notably under-explored compared to classification and segmentation tasks. Unlike classification tasks, which rely on thresholding functions for generating class pseudo-labels, regression tasks use real number target predictions directly as pseudo-labels, making them more sensitive to prediction quality. In this work, we propose a novel approach to semi-supervised regression, namely Uncertainty-Consistent Variational Model Ensembling (UCVME), which improves training by generating high-quality pseudo-labels and uncertainty estimates for heteroscedastic regression. Given that aleatoric uncertainty is only dependent on input data by definition and should be equal for the same inputs, we present a novel uncertainty consistency loss for co-trained models. Our consistency loss significantly improves uncertainty estimates and allows higher quality pseudo-labels to be assigned greater importance under heteroscedastic regression. Furthermore, we introduce a novel variational model ensembling approach to reduce prediction noise and generate more robust pseudo-labels. We analytically show our method generates higher quality targets for unlabeled data and further improves training. Experiments show that our method outperforms state-of-the-art alternatives on different tasks and can be competitive with supervised methods that use full labels. Code is available at https://github.com/xmed-lab/UCVME.
Pages: 7304-7313
Page count: 10
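The abstract above describes the two ingredients of UCVME (an aleatoric-uncertainty consistency loss between co-trained models, and variational model ensembling for pseudo-labels) only at a conceptual level. The following is a minimal PyTorch sketch of how such a training step could be wired together, under assumptions not stated in this record: two co-trained heteroscedastic regressors that each output a mean and a log-variance, MC dropout kept active to form the variational ensemble, and illustrative names (`model_a`, `model_b`, `variational_ensemble`, weights `w_unsup`, `w_cons`). It is not the authors' implementation; the official code is at the linked repository.

```python
# Minimal sketch only; see https://github.com/xmed-lab/UCVME for the official code.
# Assumes each model returns (mean, log_variance) and uses dropout layers.
import torch
import torch.nn.functional as F


def heteroscedastic_nll(y_pred, log_var, y_true):
    # Aleatoric-uncertainty regression loss: squared errors are weighted by the
    # predicted inverse variance, plus a log-variance penalty.
    return (0.5 * torch.exp(-log_var) * (y_pred - y_true) ** 2 + 0.5 * log_var).mean()


def uncertainty_consistency(log_var_a, log_var_b):
    # Aleatoric uncertainty depends only on the input, so the two co-trained
    # models are pushed to predict the same log-variance for the same input.
    return F.mse_loss(log_var_a, log_var_b)


def variational_ensemble(model_a, model_b, x_unlabeled, n_samples=5):
    # Variational model ensembling (sketch): average stochastic, dropout-on
    # forward passes of both models to get pseudo-labels and pseudo-uncertainties.
    model_a.train()  # keep dropout active
    model_b.train()
    preds, log_vars = [], []
    with torch.no_grad():
        for _ in range(n_samples):
            for model in (model_a, model_b):
                mu, log_var = model(x_unlabeled)
                preds.append(mu)
                log_vars.append(log_var)
    return torch.stack(preds).mean(dim=0), torch.stack(log_vars).mean(dim=0)


def ucvme_style_step(model_a, model_b, x_l, y_l, x_u, w_unsup=1.0, w_cons=1.0):
    # Supervised part: heteroscedastic NLL plus uncertainty consistency on labeled data.
    mu_a, lv_a = model_a(x_l)
    mu_b, lv_b = model_b(x_l)
    loss = heteroscedastic_nll(mu_a, lv_a, y_l) + heteroscedastic_nll(mu_b, lv_b, y_l)
    loss = loss + w_cons * uncertainty_consistency(lv_a, lv_b)

    # Unsupervised part: regress onto ensembled pseudo-labels, weighted by the
    # ensembled pseudo-uncertainty, and keep each model's uncertainty consistent with it.
    pseudo_y, pseudo_lv = variational_ensemble(model_a, model_b, x_u)
    mu_ua, lv_ua = model_a(x_u)
    mu_ub, lv_ub = model_b(x_u)
    loss = loss + w_unsup * (heteroscedastic_nll(mu_ua, pseudo_lv, pseudo_y)
                             + heteroscedastic_nll(mu_ub, pseudo_lv, pseudo_y))
    loss = loss + w_cons * (uncertainty_consistency(lv_ua, pseudo_lv)
                            + uncertainty_consistency(lv_ub, pseudo_lv))
    return loss
```

The inverse-variance term in the pseudo-label loss is the mechanism the abstract alludes to: unlabeled samples with lower ensembled aleatoric uncertainty (i.e., higher-quality pseudo-labels) receive greater weight under heteroscedastic regression.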