Class-Wise Denoising for Robust Learning Under Label Noise

Cited by: 11
Authors
Gong, Chen [1]
Ding, Yongliang [1]
Han, Bo [2]
Niu, Gang [3]
Yang, Jian [4]
You, Jane [5]
Tao, Dacheng [6,7]
Sugiyama, Masashi [3,8]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Key Lab Intelligent Percept & Syst High Dimens Inf, Minist Educ, Jiangsu Key Lab Image & Video Understanding, Nanjing 210094, Jiangsu, Peoples R China
[2] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
[3] RIKEN Ctr Adv Intelligence Project, Tokyo 1030027, Japan
[4] Nankai Univ, Coll Comp Sci, Tianjin 300071, Peoples R China
[5] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Peoples R China
[6] JD Explore Acad, Beijing 101100, Peoples R China
[7] Univ Sydney, Sydney, NSW 2006, Australia
[8] Univ Tokyo, Grad Sch Frontier Sci, Chiba 1138654, Japan
Keywords
Noise measurement; Training; Entropy; Estimation; Neural networks; Matrix decomposition; Fasteners; Label noise; centroid estimation; unbiasedness; variance reduction
DOI
10.1109/TPAMI.2022.3178690
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Label noise is ubiquitous in many real-world scenarios; it often misleads the training algorithm and degrades classification performance. Many approaches therefore correct the loss function computed on corrupted labels to combat such noise. Among them, a line of works achieves this goal by unbiasedly estimating the data centroid, which plays an important role in constructing an unbiased risk estimator for minimization. However, these methods usually handle the noisy labels of all classes at once, so the local information inherent in each class is ignored, which often leads to unsatisfactory performance. To address this defect, this paper presents a robust learning algorithm dubbed "Class-Wise Denoising" (CWD), which tackles the noisy labels in a class-wise way to ease the overall noise-correction task. Specifically, two virtual auxiliary sets are constructed by presuming, respectively, that the positive and the negative labels in the training set are clean, so that the original false-negative labels and false-positive labels are handled separately. As a result, an improved centroid estimator can be designed, which in turn yields a more accurate risk estimator. Theoretically, we prove that: 1) the variance of the centroid estimate can often be reduced by CWD compared with existing methods that use an unbiased centroid estimator; and 2) the performance of CWD trained on the noisy set converges to that of the optimal classifier trained on the clean set at a rate of O(1/√n), where n is the number of training examples. These sound theoretical properties enable CWD to deliver improved classification performance under label noise, which is also demonstrated by comparisons with ten representative state-of-the-art methods on a variety of benchmark datasets.
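To make the centroid idea concrete, the sketch below illustrates the standard globally corrected, unbiased centroid estimator under class-conditional label noise, i.e., the kind of estimator the abstract says CWD refines by treating the two classes separately. This is not the authors' CWD estimator; the function name, the noise rates rho_pos / rho_neg, and the synthetic data are illustrative assumptions only.

```python
import numpy as np

def unbiased_centroid(X, y_noisy, rho_pos, rho_neg):
    """Unbiasedly estimate the clean-data centroid E[y * x] from noisy labels.

    Assumes class-conditional noise: P(flip | y = +1) = rho_pos and
    P(flip | y = -1) = rho_neg, with rho_pos + rho_neg < 1.  Each noisy label
    is replaced by a corrected weight whose conditional expectation equals the
    clean label, so the weighted sample mean is unbiased for the centroid.
    """
    denom = 1.0 - rho_pos - rho_neg
    w_pos = (1.0 + rho_pos - rho_neg) / denom    # weight where y_noisy == +1
    w_neg = -(1.0 - rho_pos + rho_neg) / denom   # weight where y_noisy == -1
    w = np.where(y_noisy == 1, w_pos, w_neg)
    return (w[:, None] * X).mean(axis=0)

# Toy check on synthetic data: flip labels class-conditionally, then compare
# the clean centroid with the corrected estimate computed from noisy labels.
rng = np.random.default_rng(0)
n, d = 20000, 5
y = rng.choice([-1, 1], size=n)
X = y[:, None] * 1.5 + rng.normal(size=(n, d))
rho_pos, rho_neg = 0.3, 0.1
flip = rng.random(n) < np.where(y == 1, rho_pos, rho_neg)
y_noisy = np.where(flip, -y, y)

print("clean centroid      :", (y[:, None] * X).mean(axis=0))
print("corrected from noisy:", unbiased_centroid(X, y_noisy, rho_pos, rho_neg))
```

In expectation the corrected weighted mean matches the clean centroid; according to the abstract, CWD's class-wise treatment via the two virtual auxiliary sets is what reduces the variance of this kind of estimate.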
Pages: 2835-2848 (14 pages)
Related Papers (50 in total)
  • [1] Class-wise Deep Dictionary Learning
    Singhal, Vanika
    Khurana, Prerna
    Majumdar, Angshul
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 1125 - 1132
  • [2] Class-wise Thresholding for Robust Out-of-Distribution Detection
    Guarrera, Matteo
    Jin, Baihong
    Lin, Tung-Wei
    Zuluaga, Maria A.
    Chen, Yuxin
    Sangiovanni-Vincentelli, Alberto
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 2836 - 2845
  • [3] Class-wise dictionary learning for hyperspectral image classification
    Hao, Siyuan
    Wang, Wei
    Yan, Yan
    Bruzzone, Lorenzo
    NEUROCOMPUTING, 2017, 220 : 121 - 129
  • [4] A Novel Class-wise Forgetting Detector in Continual Learning
    Pham, Xuan Cuong
    Liew, Alan Wee-chung
    Wang, Can
    2021 INTERNATIONAL CONFERENCE ON DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA 2021), 2021, : 518 - 525
  • [5] Unsupervised Domain Adaptation Using Robust Class-Wise Matching
    Zhang, Lei
    Wang, Peng
    Wei, Wei
    Lu, Hao
    Shen, Chunhua
    van den Hengel, Anton
    Zhang, Yanning
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2019, 29 (05) : 1339 - 1349
  • [6] ClassTer: Mobile Shift-Robust Personalized Federated Learning via Class-Wise Clustering
    Li, Xiaochen
    Liu, Sicong
    Zhou, Zimu
    Xu, Yuan
    Guo, Bin
    Yu, Zhiwen
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2025, 24 (03) : 2014 - 2028
  • [7] Class-wise Information Gain
    Zhang, Pengtao
    Tan, Ying
    2013 INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE AND TECHNOLOGY (ICIST), 2013, : 972 - 978
  • [8] Class-Wise Contrastive Prototype Learning for Semi-Supervised Classification Under Intersectional Class Mismatch
    Li, Mingyu
    Zhou, Tao
    Han, Bo
    Liu, Tongliang
    Liang, Xinkai
    Zhao, Jiajia
    Gong, Chen
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 8145 - 8156
  • [9] CCL: Class-Wise Curriculum Learning for Class Imbalance Problems
    Escudero-Vinolo, Marcos
    Lopez-Cifuentes, Alejandro
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 1476 - 1480
  • [10] Deep Class-Wise Hashing: Semantics-Preserving Hashing via Class-Wise Loss
    Zhe, Xuefei
    Chen, Shifeng
    Yan, Hong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (05) : 1681 - 1695