Relation Knowledge Distillation by Auxiliary Learning for Object Detection

Cited by: 0
Authors
Wang, Hao [1 ]
Jia, Tong [1 ]
Wang, Qilong [2 ]
Zuo, Wangmeng [3 ]
Affiliations
[1] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Peoples R China
[2] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300072, Peoples R China
[3] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Task analysis; Predictive models; Object detection; Location awareness; Adaptation models; Accuracy; Head; knowledge distillation; relation information; auxiliary learning;
DOI
10.1109/TIP.2024.3445740
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Balancing the trade-off between accuracy and speed, i.e., obtaining higher performance without sacrificing inference time, is a challenging topic in object detection. Knowledge distillation, a kind of model compression technique, provides a feasible way to handle this efficiency-effectiveness issue by transferring dark knowledge from a sophisticated teacher detector to a simple student one. Despite offering promising ways to reconcile accuracy and speed, current knowledge distillation methods for object detection still suffer from two limitations. First, most methods are inherited or borrowed from frameworks designed for image classification: they work implicitly, imitating or constraining intermediate-layer features or output predictions between the teacher and student models, while paying little attention to the intrinsic relevance of the classification and localization predictions in object detection. Second, these methods fail to investigate the relationship between the detection and distillation tasks in the distillation pipeline, training the whole network by simply combining the losses of the two tasks with hand-crafted weighting parameters. To address these issues, we propose a novel Relation Knowledge Distillation by Auxiliary Learning for Object Detection (ReAL) method. Specifically, we first design a prediction relation distillation module that makes the student model directly mimic the output predictions of the teacher, and we construct self- and mutual-relation distillation losses to mine the relation information between the teacher and student models. Moreover, to better explore the relationship between the different tasks in the distillation pipeline, we introduce auxiliary learning into knowledge distillation for object detection and develop a dynamic weight adaptation strategy. Regarding detection as the primary task and distillation as the auxiliary task in the auxiliary learning framework, we dynamically adjust and regularize the corresponding loss weights during training. Experiments on the MS COCO dataset with various teacher-student detector combinations show that the proposed ReAL achieves clear improvements across different distillation configurations while performing favorably against state-of-the-art methods.
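As a rough illustration of the prediction relation distillation described in the abstract, the PyTorch sketch below pairs a self-relation term (matching the similarity structure within each model's own predictions) with a mutual-relation term (the student directly mimicking the teacher's outputs). The function names, the cosine-similarity relation matrix, and the MSE loss forms are assumptions made for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def self_relation(preds: torch.Tensor) -> torch.Tensor:
    # preds: (N, C) per-proposal prediction vectors (e.g., class logits).
    # Returns an (N, N) cosine-similarity matrix capturing how a model's
    # own predictions relate to one another.
    normed = F.normalize(preds, dim=1)
    return normed @ normed.t()

def relation_distillation_loss(student_preds: torch.Tensor,
                               teacher_preds: torch.Tensor) -> torch.Tensor:
    # Self-relation term: align the student's intra-prediction similarity
    # structure with the teacher's (teacher is detached; it is not trained).
    loss_self = F.mse_loss(self_relation(student_preds),
                           self_relation(teacher_preds.detach()))
    # Mutual-relation term: the student directly mimics the teacher's
    # output predictions (the cross-model relation).
    loss_mutual = F.mse_loss(student_preds, teacher_preds.detach())
    return loss_self + loss_mutual
```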
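The abstract does not spell out the dynamic weight adaptation rule, so the sketch below uses gradient cosine similarity between the auxiliary (distillation) loss and the primary (detection) loss, a common heuristic in auxiliary learning, purely as one plausible instantiation of "detection as primary task, distillation as auxiliary task". The name auxiliary_weight and the surrounding variables are hypothetical.

```python
import torch
import torch.nn.functional as F

def auxiliary_weight(primary_loss: torch.Tensor,
                     aux_loss: torch.Tensor,
                     shared_params) -> torch.Tensor:
    # shared_params: parameters reached by BOTH losses (e.g., the student
    # backbone). The auxiliary task only contributes when its gradient
    # agrees with the detection gradient (cosine similarity clamped at 0).
    g_p = torch.autograd.grad(primary_loss, shared_params, retain_graph=True)
    g_a = torch.autograd.grad(aux_loss, shared_params, retain_graph=True)
    flat_p = torch.cat([g.reshape(-1) for g in g_p])
    flat_a = torch.cat([g.reshape(-1) for g in g_a])
    return F.cosine_similarity(flat_p, flat_a, dim=0).clamp(min=0.0)

# Usage inside a training step (det_loss and kd_loss computed beforehand):
#   w = auxiliary_weight(det_loss, kd_loss, list(student.parameters()))
#   total_loss = det_loss + w.detach() * kd_loss
#   total_loss.backward()
```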
Pages: 4796-4810
Number of pages: 15
Related Papers
50 records in total
  • [21] Data-free Knowledge Distillation for Object Detection
    Chawla, Akshay
    Yin, Hongxu
    Molchanov, Pavlo
    Alvarez, Jose
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION WACV 2021, 2021, : 3288 - 3297
  • [22] Structured Knowledge Distillation for Accurate and Efficient Object Detection
    Zhang, Linfeng
    Ma, Kaisheng
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (12) : 15706 - 15724
  • [23] Transmission Line Detection Through Auxiliary Feature Registration With Knowledge Distillation
    Wang, Yusen
    Zhou, Wujie
    Qian, Xiaohong
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2025, 22 : 9413 - 9425
  • [24] Knowledge distillation for object detection based on Inconsistency-based Feature Imitation and Global Relation Imitation
    Ju, Peng
    Zhang, Yi
    NEUROCOMPUTING, 2024, 566
  • [25] Relation-Based Knowledge Distillation for Anomaly Detection
    Cheng, Hekai
    Yang, Lu
    Liu, Zulong
    PATTERN RECOGNITION AND COMPUTER VISION, PT I, 2021, 13019 : 105 - 116
  • [26] MAKD: Multiple Auxiliary Knowledge Distillation
    Chen, Zehan
    Jin, Xuan
    He, Yuan
    Xue, Hui
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4123 - 4127
  • [27] Forest Fire Object Detection Analysis Based on Knowledge Distillation
    Xie, Jinzhou
    Zhao, Hongmin
    FIRE-SWITZERLAND, 2023, 6 (12)
  • [28] Object Knowledge Distillation for Joint Detection and Tracking in Satellite Videos
    Zhang, Wenhua
    Deng, Wenjing
    Cui, Zhen
    Liu, Jia
    Jiao, Licheng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 13
  • [29] CrossKD: Cross-Head Knowledge Distillation for Object Detection
    Wang, Jiabao
    Chen, Yuming
    Zhang, Zhaohui
    Li, Xiang
    Cheng, Ming-Ming
    Hou, Qibin
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 16520 - 16530
  • [30] Knowledge Distillation via Hierarchical Matching for Small Object Detection
    Ma, Yong-Chi
    Ma, Xiao
    Hao, Tian-Ran
    Cui, Li-Sha
    Jin, Shao-Hui
    Lyu, Pei
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2024, 39 (04) : 798 - 810