Robust Loss Functions for Training Decision Trees with Noisy Labels

Cited by: 0
Authors
Wilton, Jonathan [1 ]
Ye, Nan [1 ]
Affiliations
[1] Univ Queensland, Brisbane, Qld, Australia
Keywords
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
We consider training decision trees on noisily labeled data, focusing on loss functions that can lead to robust learning algorithms. Our contributions are threefold. First, we offer novel theoretical insights into the robustness of many existing loss functions in the context of decision tree learning. We show that several of these losses belong to a class we call conservative losses, and that conservative losses lead to an early-stopping behavior during training and noise-tolerant predictions during testing. Second, we introduce a framework for constructing robust loss functions, called distribution losses. These losses apply percentile-based penalties based on an assumed margin distribution, and they naturally allow adapting to different noise rates via a robustness parameter. In particular, we introduce a new loss called the negative exponential loss, which leads to an efficient greedy impurity-reduction learning algorithm. Lastly, our experiments on multiple datasets and noise settings validate our theoretical insights and the effectiveness of our adaptive negative exponential loss.
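The greedy impurity-reduction idea mentioned in the abstract can be sketched in a few lines: define a node's impurity as the minimum average loss achievable by a constant prediction, then pick the split that most reduces the weighted child impurity. The sketch below is illustrative only; `node_impurity`, `best_split`, and the brute-force score search are hypothetical helpers, and the classical exponential loss `exp(-margin)` stands in as a placeholder, since the paper's specific negative exponential loss is not reproduced here.

```python
import numpy as np

def node_impurity(y, loss):
    """Impurity of a node = minimum average loss over a constant score.
    y: array of +/-1 labels; loss: vectorized function of the margin y*s.
    A coarse 1-D grid search stands in for the closed-form minimizer."""
    scores = np.linspace(-2.0, 2.0, 81)
    return min(loss(y * s).mean() for s in scores)

def best_split(x, y, loss):
    """Greedy impurity reduction: choose the threshold on feature x that
    most reduces the size-weighted impurity of the two child nodes."""
    n = len(y)
    parent = node_impurity(y, loss)
    best_t, best_red = None, 0.0
    for t in np.unique(x)[:-1]:  # candidate thresholds between samples
        left, right = y[x <= t], y[x > t]
        child = (len(left) * node_impurity(left, loss)
                 + len(right) * node_impurity(right, loss)) / n
        if parent - child > best_red:
            best_t, best_red = t, parent - child
    return best_t, best_red

# Placeholder loss (classical exponential loss), used only to make the
# sketch runnable; it is not the paper's negative exponential loss.
exp_loss = lambda m: np.exp(-m)

x = np.array([0.1, 0.2, 0.8, 0.9])
y = np.array([-1, -1, 1, 1])
t, red = best_split(x, y, exp_loss)  # splits the two classes at t = 0.2
```

With a robust (e.g. bounded) loss plugged in, mislabeled points near the decision boundary contribute less to the impurity, which is what makes the resulting greedy splits noise-tolerant.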
Pages: 15859-15867
Page count: 9