A new self-paced method for multiple instance boosting learning

Cited by: 8
Authors
Xiao, Yanshan [1 ]
Yang, Xiaozhou [1 ]
Liu, Bo [2 ]
Institutions
[1] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
Keywords
Multiple instance learning; Multiple instance boost learning; Self-paced learning
DOI
10.1016/j.ins.2019.12.015
CLC number (Chinese Library Classification)
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Multiple instance learning is a useful tool for handling label ambiguity. MILBoost is one such algorithm, which applies boosting to multiple instance learning problems. Although MILBoost has achieved good results on multiple instance learning, little work has addressed the setting in which only a small number of bags are labeled. In this paper, we propose a new approach, called Self-Paced Boost Multiple Instance Learning (SP-B-MIL), which incorporates self-paced learning (SPL) and boosting into the multiple instance learning procedure. The proposed approach improves the effectiveness and robustness of multiple instance learning when only a small number of bags are labeled. We first reformulate the multiple instance boosting model with a self-paced loss formulation. We then propose a self-paced function that realizes the desired self-paced scheme and makes it possible to select instances from different bags in each iteration. Finally, we design a simple and effective algorithm to solve the resulting optimization problem. Experimental results on several multiple instance learning benchmark data sets show that the proposed algorithm is comparable to classical algorithms. (C) 2019 Elsevier Inc. All rights reserved.
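For reference, the generic self-paced learning objective that formulations of this kind typically build on couples the task loss with binary "easiness" weights v and a pace parameter lambda; this is the standard SPL formulation (Kumar et al., 2010), shown here only as a sketch, not necessarily the exact SP-B-MIL objective of the paper:

  \min_{w,\; v \in [0,1]^{n}} \; \sum_{i=1}^{n} v_{i}\, L\!\left(y_{i}, f(x_{i}; w)\right) \;-\; \lambda \sum_{i=1}^{n} v_{i},
  \qquad
  v_{i}^{*} = \begin{cases} 1, & L\!\left(y_{i}, f(x_{i}; w)\right) < \lambda,\\ 0, & \text{otherwise.} \end{cases}

Optimization of such objectives typically alternates between the closed-form update of v above and updating the model parameters w on the selected instances, while gradually increasing lambda so that harder instances (or bags) enter training in later iterations.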
Pages: 80-90
Number of pages: 11