Real: A Representative Error-Driven Approach for Active Learning

Cited by: 1
Authors
Chen, Cheng [1]
Wang, Yong [2]
Liao, Lizi [2]
Chen, Yueguo [1]
Du, Xiaoyong [1]
Affiliations
[1] Renmin Univ China, Beijing, Peoples R China
[2] Singapore Management Univ, Singapore, Singapore
Funding
U.S. National Science Foundation;
Keywords
Active learning; Text classification; Error-driven;
DOI
10.1007/978-3-031-43412-9_2
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Given a limited labeling budget, active learning (AL) aims to sample the most informative instances from an unlabeled pool to acquire labels for subsequent model training. To achieve this, AL typically measures the informativeness of unlabeled instances based on uncertainty and diversity. However, it does not consider erroneous instances together with their neighborhood error density, which have great potential to improve model performance. To address this limitation, we propose Real, a novel approach that selects data instances with Representative Errors for Active Learning. It identifies minority predictions as pseudo errors within a cluster and allocates an adaptive sampling budget to each cluster based on its estimated error density. Extensive experiments on five text classification datasets demonstrate that Real consistently outperforms all best-performing baselines in terms of accuracy and F1-macro scores across a wide range of hyperparameter settings. Our analysis also shows that Real selects the most representative pseudo errors, which match the distribution of ground-truth errors along the decision boundary. Our code is publicly available at https://github.com/withchencheng/ECML_PKDD_23_Real.
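The sampling idea described in the abstract (cluster the unlabeled pool, treat minority predictions inside each cluster as pseudo errors, and give clusters with higher estimated error density a larger share of the labeling budget) can be illustrated with the short Python sketch below. This is not the authors' implementation: the function name select_real_batch, the use of KMeans for clustering, and the proportional budget allocation are assumptions made for illustration only; consult the linked repository for the official code.

```python
# Minimal sketch of the cluster-then-pseudo-error sampling idea (assumptions
# noted above); not the paper's official implementation.
import numpy as np
from sklearn.cluster import KMeans


def select_real_batch(embeddings, pred_labels, budget, n_clusters=25, seed=0):
    """Pick unlabeled-pool indices to query: minority predictions inside a
    cluster act as pseudo errors, and each cluster's share of the budget
    grows with its pseudo-error count (a crude stand-in for error density)."""
    embeddings = np.asarray(embeddings)
    pred_labels = np.asarray(pred_labels)

    # Cluster the unlabeled pool in embedding space (KMeans is an assumption).
    clusters = KMeans(n_clusters=n_clusters, n_init=10,
                      random_state=seed).fit_predict(embeddings)

    # Pseudo errors: instances whose predicted label disagrees with the
    # majority predicted label of their cluster.
    pseudo_errors = []
    for c in range(n_clusters):
        idx = np.where(clusters == c)[0]
        if idx.size == 0:
            pseudo_errors.append(idx)
            continue
        labels, counts = np.unique(pred_labels[idx], return_counts=True)
        majority = labels[np.argmax(counts)]
        pseudo_errors.append(idx[pred_labels[idx] != majority])

    # Adaptive budget: proportional to each cluster's estimated error density.
    density = np.array([e.size for e in pseudo_errors], dtype=float)
    if density.sum() == 0:
        return []
    shares = np.floor(budget * density / density.sum()).astype(int)

    rng = np.random.default_rng(seed)
    selected = []
    for errs, share in zip(pseudo_errors, shares):
        take = min(int(share), errs.size)
        if take > 0:
            selected.extend(rng.choice(errs, size=take, replace=False).tolist())
    return selected
```

In a standard AL loop, the indices returned by this hypothetical select_real_batch would be sent for labeling, added to the training set, and the classifier retrained before the next query round.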
Pages: 20-37
Page count: 18
Related papers
50 records in total
  • [31] Towards More Biologically Plausible Error-Driven Learning for Artificial Neural Networks
    Malinovska, Kristina
    Malinovsky, Ludovit
    Farkas, Igor
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2018, PT III, 2018, 11141 : 228 - 231
  • [32] Error-driven learning in Optimality Theory via the efficient computation of optimal forms
    Tesar, BB
    IS THE BEST GOOD ENOUGH?: OPTIMALITY AND COMPETITION IN SYNTAX, 1998, : 421 - 435
  • [33] Prediction Error-Driven Memory Consolidation for Continual Learning: On the Case of Adaptive Greenhouse Models
    Schillaci, Guido
    Schmidt, Uwe
    Miranda, Luis
    KI - Künstliche Intelligenz, 2021, 35 : 71 - 80
  • [34] Error-Driven Learning in Visual Categorization and Object Recognition: A Common-Elements Model
    Soto, Fabian A.
    Wasserman, Edward A.
    PSYCHOLOGICAL REVIEW, 2010, 117 (02) : 349 - 381
  • [35] An exploration of error-driven learning in simple two-layer networks from a discriminative learning perspective
    Hoppe, Dorothee B.
    Hendriks, Petra
    Ramscar, Michael
    van Rij, Jacolien
    BEHAVIOR RESEARCH METHODS, 2022, 54 (05) : 2221 - 2251
  • [36] ILClass: Error-driven antecedent learning for evolving Takagi-Sugeno classification systems
    Almaksour, Abdullah
    Anquetil, Eric
    APPLIED SOFT COMPUTING, 2014, 19 : 419 - 429
  • [37] Error-Driven Learning: Dopamine Signals More Than Value-Based Errors
    Keiflin, R.
    Janak, P. H.
    CURRENT BIOLOGY, 2017, 27 (24) : R1321 - R1324
  • [39] View-invariance learning in object recognition by pigeons depends on error-driven associative learning processes
    Soto, Fabian A.
    Siow, Jeffrey Y. M.
    Wasserman, Edward A.
    VISION RESEARCH, 2012, 62 : 148 - 161
  • [40] Error-driven global transition in a competitive population on a network
    Choe, SC
    Johnson, NF
    Hui, PM
    PHYSICAL REVIEW E, 2004, 70 (05): 4