Self Supervision to Distillation for Long-Tailed Visual Recognition

Cited by: 37
Authors
Li, Tianhao [1 ]
Wang, Limin [1 ]
Wu, Gangshan [1 ]
Affiliations
[1] Nanjing Univ, State Key Lab Novel Software Technol, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SMOTE;
DOI
10.1109/ICCV48922.2021.00067
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Deep learning has achieved remarkable progress in visual recognition on large-scale balanced datasets but still performs poorly on real-world long-tailed data. Previous methods often adopt class re-balanced training strategies to alleviate the imbalance issue, but they risk over-fitting the tail classes. The recent decoupling method overcomes this over-fitting by using a multi-stage training scheme, yet it is still incapable of capturing tail-class information in the feature learning stage. In this paper, we show that soft labels can serve as a powerful solution for incorporating label correlation into a multi-stage training scheme for long-tailed recognition. The intrinsic relation between classes embodied by soft labels turns out to help long-tailed recognition by transferring knowledge from head to tail classes. Specifically, we propose a conceptually simple yet particularly effective multi-stage training scheme, termed Self Supervision to Distillation (SSD). The scheme is composed of two parts. First, we introduce a self-distillation framework for long-tailed recognition that can mine the label relation automatically. Second, we present a new distillation-label generation module guided by self-supervision. The distilled labels integrate information from both the label and data domains, which models the long-tailed distribution effectively. We conduct extensive experiments, and our method achieves state-of-the-art results on three long-tailed recognition benchmarks: ImageNet-LT, CIFAR100-LT, and iNaturalist 2018. Our SSD outperforms the strong LWS baseline by 2.7% to 4.5% on the various datasets.
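The abstract's central mechanism is training a student against a teacher's soft labels so that inter-class correlations transfer from head to tail classes. The full SSD pipeline is not reproduced in this record; the sketch below only illustrates the standard temperature-scaled soft-label distillation objective that such schemes build on (a blend of KL divergence to the teacher's softened distribution and cross-entropy on hard labels). The function name and the `alpha`/`T` hyper-parameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      T=2.0, alpha=0.5):
    """Hedged sketch of a soft-label distillation objective:
    alpha * T^2 * KL(teacher_soft || student_soft) + (1 - alpha) * CE(hard).
    `T`, `alpha`, and the blending form are illustrative choices."""
    eps = 1e-12
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence from student to teacher soft labels, scaled by T^2
    # (the usual gradient-magnitude correction in distillation).
    kl = np.mean(np.sum(p_teacher * (np.log(p_teacher + eps)
                                     - np.log(p_student + eps)), axis=-1))
    kl *= T * T
    # Standard cross-entropy against the ground-truth hard labels.
    p = softmax(student_logits)
    ce = -np.mean(np.log(p[np.arange(len(hard_labels)), hard_labels] + eps))
    return alpha * kl + (1.0 - alpha) * ce
```

When the student matches the teacher, the KL term vanishes and only the hard-label cross-entropy remains; a higher temperature `T` flattens the teacher distribution, exposing more of the inter-class relations the abstract argues are useful for tail classes.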
Pages: 610-619
Page count: 10
Related Papers
50 records
  • [21] Dynamic Learnable Logit Adjustment for Long-Tailed Visual Recognition
    Zhang, Enhao
    Geng, Chuanxing
    Li, Chaohua
    Chen, Songcan
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (09) : 7986 - 7997
  • [22] Feature Re-Balancing for Long-Tailed Visual Recognition
    Zhao, Yan
    Chen, Weicong
    Huang, Kai
    Zhu, Jihong
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [23] FCC: Feature Clusters Compression for Long-Tailed Visual Recognition
    Li, Jian
    Meng, Ziyao
    Shi, Daqian
    Song, Rui
    Diao, Xiaolei
    Wang, Jingwen
    Xu, Hao
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 24080 - 24089
  • [24] Balanced clustering contrastive learning for long-tailed visual recognition
    Kim, Byeong-il
    Ko, Byoung Chul
    PATTERN ANALYSIS AND APPLICATIONS, 2025, 28 (01)
  • [25] Feature calibration and feature separation for long-tailed visual recognition
    Wang, Qianqian
    Zhou, Fangyu
    Zhao, Xiangge
    Lin, Yangtao
    Ye, Haibo
    NEUROCOMPUTING, 2025, 637
  • [26] Adaptive Logit Adjustment Loss for Long-Tailed Visual Recognition
    Zhao, Yan
    Chen, Weicong
    Tan, Xu
    Huang, Kai
    Zhu, Jihong
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELVETH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 3472 - 3480
  • [27] Hierarchical block aggregation network for long-tailed visual recognition
    Pang, Shanmin
    Wang, Weiye
    Zhang, Renzhong
    Hao, Wenyu
    NEUROCOMPUTING, 2023, 549
  • [28] MetaSAug: Meta Semantic Augmentation for Long-Tailed Visual Recognition
    Li, Shuang
    Gong, Kaixiong
    Liu, Chi Harold
    Wang, Yulin
    Qiao, Feng
    Cheng, Xinjing
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 5208 - 5217
  • [29] Dynamic prior probability network for long-tailed visual recognition
    Zhou, Xuesong
    Sun, Jiaqi
    Zhai, Junhai
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 268
  • [30] Joint weighted knowledge distillation and multi-scale feature distillation for long-tailed recognition
    He, Yiru
    Wang, Shiqian
    Yu, Junyang
    Liu, Chaoyang
    He, Xin
    Li, Han
    International Journal of Machine Learning and Cybernetics, 2024, 15 : 1647 - 1661