Self-paced multi-label co-training

Cited by: 4
Authors
Gong, Yanlu [1 ]
Wu, Quanwang [1 ]
Zhou, Mengchu [2 ]
Wen, Junhao [3 ]
Affiliations
[1] Chongqing Univ, Coll Comp Sci, Chongqing 400030, Peoples R China
[2] New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
[3] Chongqing Univ, Sch Big Data & Software Engn, Chongqing 401331, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Co-training; Label rectification; Multi-label classification; Self-paced learning; Semi-supervised learning; LEARNING APPROACH; QUALITY; MODEL;
DOI
10.1016/j.ins.2022.11.153
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Subject classification code
0812;
Abstract
Multi-label learning aims to solve classification problems where instances are associated with a set of labels. In reality, it is generally easy to acquire unlabeled data but expensive or time-consuming to label them, and this situation becomes more serious in multi-label learning as an instance needs to be annotated with several labels. Hence, semi-supervised multi-label learning approaches emerge as they are able to exploit unlabeled data to help train predictive models. This work proposes a novel approach called Self-paced Multi-label Co-Training (SMCT). It leverages the well-known co-training paradigm to iteratively train two classifiers on two views of a dataset and communicate one classifier's predictions on unlabeled data to augment the other's training set. As pseudo labels may be false in iterative training, self-paced learning is integrated into SMCT to rectify false pseudo labels and avoid error accumulation. Concretely, the multi-label co-training model in SMCT is formulated as an optimization problem by introducing latent weight variables of unlabeled instances. It is then solved via an alternative convex optimization algorithm. Experimental evaluations are carried out based on six benchmark multi-label datasets and three metrics. The results demonstrate that SMCT is very competitive in each setting when compared with five state-of-the-art methods. (c) 2022 Elsevier Inc. All rights reserved.
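To make the co-training loop described in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it assumes the two views are given as separate feature matrices, labels form a binary indicator matrix, one-vs-rest logistic regression (scikit-learn) serves as each view's base learner, and a confidence threshold relaxed each round stands in for the paper's latent-weight self-paced optimization. All names and hyper-parameters (rounds, conf0, step) are illustrative assumptions.
```python
# Hypothetical sketch of self-paced multi-label co-training (not the authors' code).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier


def fit_view(X, Y):
    """Train one per-view multi-label classifier (one-vs-rest logistic regression)."""
    return OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)


def self_paced_co_training(Xs_l, Y_l, Xs_u, rounds=5, conf0=0.9, step=0.05):
    """Xs_l / Xs_u hold the two views' labeled / unlabeled feature matrices."""
    models = [fit_view(X, Y_l) for X in Xs_l]
    for r in range(rounds):
        conf = max(0.5, conf0 - r * step)   # self-paced: admit harder instances later
        new_models = []
        for dst in (0, 1):
            src = 1 - dst                   # the other view supplies pseudo labels
            proba = models[src].predict_proba(Xs_u[src])
            pseudo = (proba >= 0.5).astype(int)
            # Keep only unlabeled instances whose every label is predicted confidently.
            easy = np.abs(proba - 0.5).min(axis=1) >= (conf - 0.5)
            X_aug = np.vstack([Xs_l[dst], Xs_u[dst][easy]])
            Y_aug = np.vstack([Y_l, pseudo[easy]])
            new_models.append(fit_view(X_aug, Y_aug))
        # Pseudo labels are recomputed from scratch each round, loosely mirroring
        # the paper's label rectification via latent instance weights.
        models = new_models
    return models
```
Calling self_paced_co_training([X1_l, X2_l], Y_l, [X1_u, X2_u]) returns one trained classifier per view; the two views' predicted probabilities can then be combined, for example by averaging.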
Pages: 269-281
Page count: 13
Related papers
50 records in total
  • [21] Self-paced multi-task clustering
    Ren, Yazhou
    Que, Xiaofan
    Yao, Dezhong
    Xu, Zenglin
    NEUROCOMPUTING, 2019, 350 : 212 - 220
  • [22] Self-Paced Multi-Task Learning
    Li, Changsheng
    Yan, Junchi
    Wei, Fan
    Dong, Weishan
    Liu, Qingshan
    Zha, Hongyuan
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2175 - 2181
  • [23] Multi-Objective Self-Paced Learning
    Li, Hao
    Gong, Maoguo
    Meng, Deyu
    Miao, Qiguang
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 1802 - 1808
  • [24] A Self-Paced Regularization Framework for Partial-Label Learning
    Lyu, Gengyu
    Feng, Songhe
    Wang, Tao
    Lang, Congyan
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (02) : 899 - 911
  • [25] Reducing Computer Anxiety in Self-Paced Technology Training
    Gupta, Saurabh
    PROCEEDINGS OF THE 50TH ANNUAL HAWAII INTERNATIONAL CONFERENCE ON SYSTEM SCIENCES, 2017, : 154 - 163
  • [26] Self-paced Robust Deep Face Recognition with Label Noise
    Zhu, Pengfei
    Ma, Wenya
    Hu, Qinghua
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2019, PT III, 2019, 11441 : 425 - 435
  • [27] Partial Label Learning via Self-Paced Curriculum Strategy
    Lyu, Gengyu
    Feng, Songhe
    Jin, Yi
    Li, Yidong
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT II, 2021, 12458 : 489 - 505
  • [28] Self-paced data augmentation for training neural networks
    Takase, Tomoumi
    Karakida, Ryo
    Asoh, Hideki
    NEUROCOMPUTING, 2021, 442 : 296 - 306
  • [29] Self-Paced AutoEncoder
    Yu, Tingzhao
    Guo, Chaoxu
    Wang, Lingfeng
    Xiang, Shiming
    Pan, Chunhong
    IEEE SIGNAL PROCESSING LETTERS, 2018, 25 (07) : 1054 - 1058
  • [30] SELF-PACED CHEMISTRY
    HAWKES, SJ
    JOURNAL OF CHEMICAL EDUCATION, 1984, 61 (06) : 564 - 565