Learning from Multiple Noisy Partial Labelers

Cited by: 0
Authors
Yu, Peilin [1 ]
Ding, Tiffany [2 ]
Bach, Stephen H. [1 ]
Affiliations
[1] Brown Univ, Providence, RI 02912 USA
[2] Univ Calif Berkeley, Berkeley, CA USA
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Programmatic weak supervision creates models without hand-labeled training data by combining the outputs of heuristic labelers. Existing frameworks make the restrictive assumption that labelers output a single class label. Enabling users to create partial labelers that output subsets of possible class labels would greatly expand the expressivity of programmatic weak supervision. We introduce this capability by defining a probabilistic generative model that can estimate the underlying accuracies of multiple noisy partial labelers without ground truth labels. We show how to scale up learning, for example learning on 100k examples in one minute, a 300x speedup over a naive implementation. We also prove that this class of models is generically identifiable up to label swapping under mild conditions. We evaluate our framework on three text classification and six object classification tasks. On text tasks, adding partial labels increases average accuracy by 8.6 percentage points. On image tasks, we show that partial labels allow us to approach some zero-shot object classification problems with programmatic weak supervision by using class attributes as partial labelers. On these tasks, our framework has accuracy comparable to recent embedding-based zero-shot learning methods, while using only pre-trained attribute detectors.
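The abstract's key idea is a partial labeler: a heuristic that votes for a subset of possible classes rather than a single class, with the labelers' unknown accuracies estimated by a probabilistic generative model without ground truth. The sketch below only illustrates that interface; the labeling functions, class names, and hand-set weights are hypothetical stand-ins, and the simple weighted vote is not the paper's generative model, which would instead learn the accuracy weights from unlabeled data.

```python
# Illustrative sketch only: the interface of a partial labeler, which
# votes for a SUBSET of classes (or abstains), plus a naive weighted vote.
# All function names, keywords, and weights are hypothetical examples.

CLASSES = ["politics", "sports", "tech"]

def lf_sports_terms(text):
    """Partial labeler: sports vocabulary narrows the label to one class."""
    if any(w in text.lower() for w in ("goal", "match", "season")):
        return {"sports"}
    return None  # abstain

def lf_public_sector(text):
    """Partial labeler: narrows to a subset of classes, not a single one."""
    if any(w in text.lower() for w in ("election", "policy", "regulation")):
        return {"politics", "tech"}  # could be politics or tech policy
    return None  # abstain

def combine(text, labelers, weights):
    """Spread each labeler's weight over the classes it leaves possible."""
    scores = {c: 0.0 for c in CLASSES}
    for lf, w in zip(labelers, weights):
        subset = lf(text)
        if subset is None:
            continue
        for c in subset:
            scores[c] += w / len(subset)
    return scores

if __name__ == "__main__":
    labelers = [lf_sports_terms, lf_public_sector]
    weights = [0.9, 0.6]  # hand-set stand-ins for learned accuracies
    print(combine("New regulation targets online election ads", labelers, weights))
```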
Pages: 24