Self-supervised efficient sample weighting for multi-exit networks

Times Cited: 0
Authors
Liu, Kai [1 ]
Moon, Seungbin [1 ]
Affiliations
[1] Sejong Univ, Dept Comp Engn, Seoul 05006, South Korea
Keywords
Sample weighting; Early-exit networks; Self-supervised; Efficient training
DOI
10.1016/j.knosys.2023.111003
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Dynamic sample weighting is an effective approach for improving the inference accuracy and efficiency of the different classifiers in multi-exit networks. By bringing the early-exit behavior from testing into training, such a method allows input samples to contribute differently at each sub-network, reducing the gap between training and inference. Recent methods mainly learn a weighting mechanism jointly with the multi-exit model through reinforcement-learning and meta-learning based frameworks. However, these approaches usually require additional data, which increases computation cost and limits their general applicability. To address these two problems, we propose a Self-supervised Efficient Sample Weighting (SESW) method that predicts a weight for each input sample based on its loss at each exit. We pose the problem as a self-supervised multi-class classification task: the SESW module is trained to predict whether the log-likelihood of each sample's loss at the current training stage belongs to the most confident classifier. Our experiments demonstrate that the proposed method performs comparably to state-of-the-art methods in prediction accuracy and dynamic inference efficiency while achieving a significant reduction in training time. (c) 2023 Elsevier B.V. All rights reserved.
Pages: 9
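
The following is a minimal sketch of the weighting idea described in the abstract, not the authors' implementation: it assumes a PyTorch multi-exit classifier, and it reads "most confident classifier" as the exit with the smallest loss for a sample, which is one plausible interpretation of the self-supervised target. All identifiers (SESWModule, weighted_multi_exit_loss, num_exits) are hypothetical.

    # Hedged sketch of SESW-style loss-based sample weighting (PyTorch).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SESWModule(nn.Module):
        """Predicts, from a sample's per-exit losses, which exit is the
        most confident classifier for that sample (multi-class problem)."""
        def __init__(self, num_exits: int, hidden: int = 32):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(num_exits, hidden),
                nn.ReLU(),
                nn.Linear(hidden, num_exits),
            )

        def forward(self, per_exit_losses: torch.Tensor) -> torch.Tensor:
            # per_exit_losses: (batch, num_exits) cross-entropy losses,
            # detached so the weighting net does not backprop into the
            # backbone. The log transform stabilizes loss scale across
            # training stages (an assumption, mirroring the abstract's
            # mention of log-likelihoods of the losses).
            return self.net(torch.log(per_exit_losses.detach() + 1e-8))

    def weighted_multi_exit_loss(exit_logits, targets, sesw):
        # exit_logits: list of (batch, classes) tensors, one per exit.
        losses = torch.stack(
            [F.cross_entropy(lg, targets, reduction="none")
             for lg in exit_logits],
            dim=1)                               # (batch, num_exits)
        scores = sesw(losses)                    # exit-membership scores
        weights = torch.softmax(scores, dim=1)   # per-exit sample weights

        # Self-supervised target: the exit with the smallest loss is
        # taken as the "most confident classifier" for the sample.
        pseudo = losses.argmin(dim=1)
        sesw_loss = F.cross_entropy(scores, pseudo)

        # Weighted training loss for the multi-exit network itself.
        net_loss = (weights.detach() * losses).sum(dim=1).mean()
        return net_loss, sesw_loss

In a training loop, net_loss would update the multi-exit backbone while sesw_loss updates the weighting module; because the SESW module only consumes per-sample losses already computed during training, it adds no extra data requirement, which is consistent with the training-time savings the abstract claims.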