Learning Noise Transition Matrix from Only Noisy Labels via Total Variation Regularization

Cited by: 0
Authors
Zhang, Yivan [1 ,2 ]
Niu, Gang [2 ]
Sugiyama, Masashi [1 ,2 ]
Affiliations
[1] Univ Tokyo, Tokyo, Japan
[2] RIKEN AIP, Tokyo, Japan
Keywords
CLASSIFICATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many weakly supervised classification methods employ a noise transition matrix to capture the class-conditional label corruption. To estimate the transition matrix from noisy data, existing methods often need to estimate the noisy class-posterior, which could be unreliable due to the overconfidence of neural networks. In this work, we propose a theoretically grounded method that can estimate the noise transition matrix and learn a classifier simultaneously, without relying on the error-prone noisy class-posterior estimation. Concretely, inspired by the characteristics of the stochastic label corruption process, we propose total variation regularization, which encourages the predicted probabilities to be more distinguishable from each other. Under mild assumptions, the proposed method yields a consistent estimator of the transition matrix. We show the effectiveness of the proposed method through experiments on benchmark and real-world datasets.
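For intuition only, the following minimal PyTorch sketch shows one way the idea summarized above could be set up: a learnable row-stochastic transition matrix maps the model's predicted clean-class probabilities to noisy-class probabilities, and a pairwise total-variation term pushes the predictions within a mini-batch apart so they remain distinguishable. The class and function names, the near-identity initialization, and the weight lam are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TransitionMatrix(nn.Module):
    # Learnable row-stochastic transition matrix T (clean class -> noisy class).
    def __init__(self, num_classes):
        super().__init__()
        # Near-identity initialization (an assumption): little corruption at the start.
        self.logits = nn.Parameter(torch.eye(num_classes) * 5.0)

    def forward(self):
        return F.softmax(self.logits, dim=1)  # each row sums to 1

def pairwise_total_variation(probs):
    # Mean pairwise total variation distance between predicted probability
    # vectors in a mini-batch: TV(p_i, p_j) = 0.5 * ||p_i - p_j||_1.
    diff = probs.unsqueeze(1) - probs.unsqueeze(0)   # (B, B, C)
    return 0.5 * diff.abs().sum(dim=2).mean()        # diagonal pairs contribute 0

def tv_regularized_loss(clean_probs, T, noisy_labels, lam=0.1):
    # Cross-entropy on the induced noisy posterior q = p @ T, minus a
    # total-variation term that keeps the clean predictions distinguishable.
    noisy_probs = clean_probs @ T                    # (B, C)
    ce = F.nll_loss(torch.log(noisy_probs + 1e-12), noisy_labels)
    tv = pairwise_total_variation(clean_probs)
    return ce - lam * tv                             # lam is an illustrative weight

# Usage sketch: clean_probs = F.softmax(model(x), dim=1)
#               loss = tv_regularized_loss(clean_probs, T_module(), y_noisy)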
Pages: 12
Related Papers
50 records in total
  • [1] Learning with Noisy Labels via Sparse Regularization
    Zhou, Xiong
    Liu, Xianming
    Wang, Chenyang
    Zhai, Deming
    Jiang, Junjun
    Ji, Xiangyang
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 72 - 81
  • [2] MATRIX SMOOTHING: A REGULARIZATION FOR DNN WITH TRANSITION MATRIX UNDER NOISY LABELS
    Lv, Xianbin
    Wu, Dongxian
    Xia, Shu-Tao
    2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2020,
  • [3] Learning Deep Networks from Noisy Labels with Dropout Regularization
    Jindal, Ishan
    Nokleby, Matthew
    Chen, Xuewen
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 967 - 972
  • [4] Class-Independent Regularization for Learning with Noisy Labels
    Yi, Rumeng
    Guan, Dayan
    Huang, Yaping
    Lu, Shijian
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 3, 2023, : 3276 - 3284
  • [5] Consistency Regularization on Clean Samples for Learning with Noisy Labels
    Nomura, Yuichiro
    Kurita, Takio
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 387 - 395
  • [6] Towards Federated Learning against Noisy Labels via Local Self-Regularization
    Jiang, Xuefeng
    Sun, Sheng
    Wang, Yuwei
    Liu, Min
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 862 - 873
  • [7] Learning with Noisy Labels by Efficient Transition Matrix Estimation to Combat Label Miscorrection
    Kye, Seong Min
    Choi, Kwanghee
    Yi, Joonyoung
    Chang, Buru
    COMPUTER VISION, ECCV 2022, PT XXV, 2022, 13685 : 717 - 738
  • [8] Subclass consistency regularization for learning with noisy labels based on contrastive learning
    Sun, Xinkai
    Zhang, Sanguo
    NEUROCOMPUTING, 2025, 614
  • [9] Efficient InSAR phase noise reduction via total variation regularization
    Luo, XiaoMei
    Wang, XiangFeng
    Suo, ZhiYong
    Li, ZhenFang
    SCIENCE CHINA INFORMATION SCIENCES, 2015, 58 (08) : 64 - 76
  • [10] Learning From Noisy Labels via Dynamic Loss Thresholding
    Yang, Hao
    Jin, You-Zhi
    Li, Zi-Yin
    Wang, Deng-Bao
    Geng, Xin
    Zhang, Min-Ling
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6503 - 6516