Three Heads Are Better than One: Complementary Experts for Long-Tailed Semi-supervised Learning

Cited: 0
Authors
Ma, Chengcheng [1 ,2 ]
Elezi, Ismail [3 ]
Deng, Jiankang [3 ]
Dong, Weiming [1 ]
Xu, Changsheng [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
[3] Huawei Noah's Ark Lab, London, England
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We address the challenging problem of Long-Tailed Semi-Supervised Learning (LTSSL), where the labeled data exhibit an imbalanced class distribution and the unlabeled data follow an unknown distribution. Unlike in balanced SSL, the generated pseudo-labels are skewed towards head classes, intensifying the training bias. This phenomenon is further amplified when the class distributions of the labeled and unlabeled datasets are mismatched, since even more unlabeled data are then mislabeled as head classes. To solve this problem, we propose a novel method named ComPlementary Experts (CPE). Specifically, we train multiple experts to model various class distributions, each yielding high-quality pseudo-labels within one form of class distribution. In addition, we introduce Classwise Batch Normalization for CPE to avoid the performance degradation caused by the feature distribution mismatch between head and non-head classes. CPE achieves state-of-the-art performance on the CIFAR-10-LT, CIFAR-100-LT, and STL-10-LT benchmarks. For instance, on CIFAR-10-LT, CPE improves test accuracy by over 2.22% compared to baselines. Code is available at https://github.com/machengcheng2016/CPE-LTSSL.
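The core idea in the abstract, multiple experts that each produce pseudo-labels under a different assumed class distribution, can be sketched with a standard logit-adjustment trick. This is a minimal illustration under our own assumptions (the tau values, the class counts, and the shared-logit setup are hypothetical), not the authors' exact implementation; see the linked repository for the real method.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the class axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def logit_adjust(logits, class_prior, tau):
    # Shift logits by tau * log(prior); tau = 0 leaves them unchanged,
    # tau > 0 favors head classes, tau < 0 favors tail classes.
    return logits + tau * np.log(class_prior)

# A long-tailed labeled set (CIFAR-10-LT-style class counts; illustrative).
counts = np.array([5000, 2997, 1796, 1077, 645, 387, 232, 139, 83, 50],
                  dtype=float)
prior = counts / counts.sum()

rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 10))  # shared-backbone outputs for a batch of 4

# Three "experts" via different adjustments: one fitting the long-tailed
# labeled distribution, one a roughly uniform one, one an inverted one.
taus = [1.0, 0.0, -1.0]
expert_probs = [softmax(logit_adjust(logits, prior, t)) for t in taus]
```

Each entry of `expert_probs` plays the role of one expert's pseudo-label distribution; an unlabeled sample can then be pseudo-labeled by whichever expert matches the (unknown) unlabeled-data distribution best. The paper's Classwise Batch Normalization is a separate component not shown here.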
Pages: 14229-14237
Page count: 9
Related Papers
50 results
  • [41] Self-Supervised Skill Learning for Semi-Supervised Long-Horizon Instruction Following
    Zhuang, Benhui
    Zhang, Chunhong
    Hu, Zheng
    ELECTRONICS, 2023, 12 (07)
  • [42] Multiple Heads are Better than One: Few-shot Font Generation with Multiple Localized Experts
    Park, Song
    Chun, Sanghyuk
    Cha, Junbum
    Lee, Bado
    Shim, Hyunjung
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 13880 - 13889
  • [43] FedTriNet: A Pseudo Labeling Method with Three Players for Federated Semi-supervised Learning
    Che, Liwei
    Long, Zewei
    Wang, Jiaqi
    Wang, Yaqing
    Xiao, Houping
    Ma, Fenglong
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 715 - 724
  • [44] Three-Way and Semi-supervised Decision Tree Learning Based on Orthopartitions
    Campagner, Andrea
    Ciucci, Davide
    INFORMATION PROCESSING AND MANAGEMENT OF UNCERTAINTY IN KNOWLEDGE-BASED SYSTEMS: THEORY AND FOUNDATIONS, PT II, 2018, 854 : 748 - 759
  • [45] DetMatch: Two Teachers are Better than One for Joint 2D and 3D Semi-Supervised Object Detection
    Park, Jinhyung
    Xu, Chenfeng
    Zhou, Yiyang
    Tomizuka, Masayoshi
    Zhan, Wei
    COMPUTER VISION, ECCV 2022, PT X, 2022, 13670 : 370 - 389
  • [46] Complementary consistency semi-supervised learning for 3D left atrial image segmentation
    Huang, Hejun
    Chen, Zuguo
    Chen, Chaoyang
    Lu, Ming
    Zou, Ying
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 165
  • [47] The Effect of Semi-supervised Learning on Parsing Long Distance Dependencies in German and Swedish
    Sogaard, Anders
    Rishoj, Christian
    ADVANCES IN NATURAL LANGUAGE PROCESSING, 2010, 6233 : 406 - 417
  • [49] Two Heads Are Better Than One (and Three Are Better Than Two): Challenging the Individualist Ethos of the Educator-Hero Film
    Benton, Steve
    JOURNAL OF POPULAR FILM AND TELEVISION, 2013, 41 (02) : 98 - 108
  • [50] When two heads are better than one: nutritional epidemiology meets machine learning
    Krishnan, Sridevi
    Ramyaa, Ramyaa
    AMERICAN JOURNAL OF CLINICAL NUTRITION, 2020, 111 (06): : 1124 - 1126