Deep Learning From Multiple Noisy Annotators as a Union

Cited: 14
Authors
Wei, Hongxin [1 ]
Xie, Renchunzi [1 ]
Feng, Lei [2 ]
Han, Bo [3 ]
An, Bo [1 ]
Affiliations
[1] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[2] Chongqing Univ, Coll Comp Sci, Chongqing 400044, Peoples R China
[3] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China; National Research Foundation, Singapore;
Keywords
Training; Deep learning; Labeling; Noise measurement; Neural networks; Standards; Learning systems; Annotators; crowdsourcing; noisy labels; transition matrix; CLASSIFICATION;
DOI
10.1109/TNNLS.2022.3168696
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Crowdsourcing is a popular solution for large-scale data annotation. So far, various end-to-end deep learning methods have been proposed to improve the practical performance of learning from crowds. Despite their practical effectiveness, most of them suffer from two major limitations: they do not guarantee learning consistency and they are computationally inefficient. In this article, we propose a novel method named UnionNet, which is not only theoretically consistent but also experimentally effective and efficient. Specifically, unlike existing methods that either fit the label from each annotator independently or fuse all the labels into a single reliable one, we concatenate the one-hot encoded vectors of the crowdsourced labels provided by all the annotators, which treats all the labeling information as a union and coordinates the multiple annotators. In this way, we can directly train an end-to-end deep neural network by maximizing the likelihood of this union with only a parametric transition matrix. We theoretically prove the learning consistency and experimentally show the effectiveness and efficiency of the proposed method.
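As a rough illustration of the idea sketched in the abstract (concatenating the one-hot labels of all annotators into a single "union" vector and maximizing its likelihood through a parametric transition matrix), a minimal PyTorch sketch follows. The module name UnionLoss, the tensor shapes, and the exact likelihood form are assumptions made for illustration only; this is not the authors' released implementation of UnionNet.

```python
# Minimal, illustrative sketch of training from the "union" of annotator labels.
# Assumptions (not from the paper): a transition matrix of shape (R*K, K) maps the
# K-dim class posterior to an (R*K)-dim distribution over the concatenated labels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class UnionLoss(nn.Module):
    """Negative log-likelihood of the concatenated (union) annotator labels."""

    def __init__(self, num_classes: int, num_annotators: int):
        super().__init__()
        # Parametric transition matrix, learned jointly with the backbone network.
        self.transition = nn.Parameter(
            torch.randn(num_annotators * num_classes, num_classes) * 0.01
        )

    def forward(self, logits: torch.Tensor, union_labels: torch.Tensor) -> torch.Tensor:
        # logits: (B, K) backbone outputs; union_labels: (B, R*K) concatenated one-hots.
        posterior = F.softmax(logits, dim=1)          # p(y | x), shape (B, K)
        col_t = F.softmax(self.transition, dim=0)     # keep each column on the simplex
        union_prob = posterior @ col_t.t()            # (B, R*K) union-label probabilities
        # Normalize the observed union so the per-example weights sum to one.
        weights = union_labels / union_labels.sum(dim=1, keepdim=True)
        log_lik = (weights * torch.log(union_prob + 1e-12)).sum(dim=1)
        return -log_lik.mean()


# Usage with any backbone that outputs K logits (hypothetical toy setup):
if __name__ == "__main__":
    K, R, B = 10, 3, 16
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, K))
    criterion = UnionLoss(num_classes=K, num_annotators=R)
    x = torch.randn(B, 3, 32, 32)
    # Fake crowdsourced labels: one label per annotator, one-hot encoded, then concatenated.
    crowd = torch.randint(0, K, (B, R))
    union = torch.cat([F.one_hot(crowd[:, r], K) for r in range(R)], dim=1).float()
    loss = criterion(backbone(x), union)
    loss.backward()
    print(float(loss))
```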
Pages: 10552-10562
Number of pages: 11