A bi-level metric learning framework via self-paced learning weighting

Cited by: 0
Authors
Yan, Jing [1 ]
Wei, Wei [1 ]
Guo, Xinyao [1 ]
Dang, Chuangyin [2 ,3 ]
Liang, Jiye [1 ]
Affiliations
[1] Shanxi Univ, Sch Comp & Informat Technol, Minist Educ, Key Lab Computat Intelligence & Chinese Informat Proc, Taiyuan, Shanxi, Peoples R China
[2] City Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Peoples R China
[3] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metric learning; Self-paced learning; Adaptive neighborhood; Weighting tuples;
DOI
10.1016/j.patcog.2023.109446
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Distance metric learning (DML) has achieved great success in many real-world applications. However, most existing DML models characterize the quality of tuples only at the tuple level while ignoring the anchor level. As a result, these models portray the quality of tuples less accurately and tend to overfit when the anchors are noisy samples. In this paper, we devise a bi-level metric learning framework (BMLF), which characterizes the quality of tuples more finely at both levels and thereby enhances the generalization performance of the DML model. Furthermore, we present an implementation of BMLF based on a self-paced learning regularization term and design the corresponding optimization algorithm. By weighting tuples at the anchor level and preferentially training the model on tuples with higher weights, the side effect of low-quality noisy samples is alleviated. We empirically demonstrate that the proposed method outperforms state-of-the-art methods in effectiveness and robustness on several benchmark datasets. (c) 2023 Elsevier Ltd. All rights reserved.
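To make the weighting idea in the abstract concrete, below is a minimal, illustrative Python sketch of hard self-paced weighting at the anchor level for a Mahalanobis triplet loss. It is not the authors' BMLF implementation; the function names (triplet_losses, spl_weights, train), the hinge triplet loss, the age parameter lam, and its growth schedule are assumptions made purely for illustration.

import numpy as np

def triplet_losses(X, M, triplets, margin=1.0):
    # Hinge triplet loss d_M(a,p) - d_M(a,n) + margin under Mahalanobis matrix M.
    losses = []
    for a, p, n in triplets:
        d_ap = (X[a] - X[p]) @ M @ (X[a] - X[p])
        d_an = (X[a] - X[n]) @ M @ (X[a] - X[n])
        losses.append(max(0.0, d_ap - d_an + margin))
    return np.array(losses)

def spl_weights(losses, triplets, lam):
    # Hard self-paced weights at the anchor level: an anchor (and all of its
    # triplets) is selected only if its average triplet loss is below lam.
    anchors = np.array([a for a, _, _ in triplets])
    w = np.zeros(len(triplets))
    for a in np.unique(anchors):
        idx = np.where(anchors == a)[0]
        if losses[idx].mean() < lam:   # "easy" anchor -> include its tuples
            w[idx] = 1.0
    return w

def train(X, triplets, n_iter=50, lr=0.01, lam=1.0, lam_growth=1.1, margin=1.0):
    d = X.shape[1]
    M = np.eye(d)                      # start from the Euclidean metric
    for _ in range(n_iter):
        losses = triplet_losses(X, M, triplets, margin)
        w = spl_weights(losses, triplets, lam)
        # Subgradient step over the selected ("easy") tuples only.
        G = np.zeros_like(M)
        for wi, li, (a, p, n) in zip(w, losses, triplets):
            if wi > 0 and li > 0:
                dap, dan = X[a] - X[p], X[a] - X[n]
                G += wi * (np.outer(dap, dap) - np.outer(dan, dan))
        M -= lr * G / max(1.0, w.sum())
        # Project back onto the PSD cone so M remains a valid metric.
        eigval, eigvec = np.linalg.eigh(M)
        M = (eigvec * np.clip(eigval, 0.0, None)) @ eigvec.T
        lam *= lam_growth              # gradually admit harder anchors
    return M

# Usage with toy data (each triplet is (anchor, positive, negative) indices):
# X = np.random.randn(100, 5)
# triplets = [(i, (i + 1) % 100, (i + 50) % 100) for i in range(100)]
# M = train(X, triplets)

The anchor-level weighting described in the abstract is visible in spl_weights: an anchor's tuples enter the objective only once that anchor's average loss is small enough, and the age parameter lam is relaxed over iterations so that harder anchors are admitted gradually.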
Pages: 12