Relation Knowledge Distillation by Auxiliary Learning for Object Detection

Cited: 0
Authors
Wang, Hao [1 ]
Jia, Tong [1 ]
Wang, Qilong [2 ]
Zuo, Wangmeng [3 ]
Affiliations
[1] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Peoples R China
[2] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300072, Peoples R China
[3] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Task analysis; Predictive models; Object detection; Location awareness; Adaptation models; Accuracy; Head; knowledge distillation; relation information; auxiliary learning;
DOI
10.1109/TIP.2024.3445740
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Balancing the trade-off between accuracy and speed, i.e., obtaining higher performance without sacrificing inference time, is a challenging topic in object detection. Knowledge distillation, a model compression technique, provides a feasible way to address this efficiency-effectiveness issue by transferring dark knowledge from a sophisticated teacher detector to a simple student one. Despite offering promising solutions for harmonizing accuracy and speed, current knowledge distillation methods for object detection still suffer from two limitations. First, most methods are inherited or adapted from image classification frameworks and operate implicitly, forcing the student to imitate the teacher's intermediate-layer features or output predictions, while paying little attention to the intrinsic relevance between the classification and localization predictions in object detection. Second, these methods fail to investigate the relationship between the detection and distillation tasks in the distillation pipeline, and train the whole network by simply combining the losses of the two tasks with hand-crafted weighting parameters. To address these issues, this paper proposes a novel Relation Knowledge Distillation by Auxiliary Learning for Object Detection (ReAL) method. Specifically, we first design a prediction relation distillation module that makes the student model directly mimic the teacher's output predictions, and construct self- and mutual-relation distillation losses to mine the relation information between the teacher and student models. Moreover, to better explore the relationship between the different tasks in the distillation pipeline, we introduce auxiliary learning into knowledge distillation for object detection and develop a dynamic weight adaptation strategy. Regarding detection as the primary task and distillation as the auxiliary task in the auxiliary learning framework, we dynamically adjust and regularize the corresponding loss weights during training. Experiments on the MS COCO dataset with various teacher-student detector combinations show that the proposed ReAL achieves clear improvements across different distillation configurations, while performing favorably against state-of-the-art methods.
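The abstract names self- and mutual-relation distillation losses without giving their form. The PyTorch sketch below is one plausible reading, not the paper's definition: the self-relation loss aligns the pairwise similarity structure within each model's predictions, and the mutual-relation loss relates student predictions directly to teacher predictions. All function names are hypothetical, and it assumes the N student and teacher predictions are matched one-to-one.

```python
import torch
import torch.nn.functional as F

def pairwise_relation(preds):
    # Cosine-similarity relation matrix among N prediction vectors: (N, N).
    preds = F.normalize(preds, dim=1)
    return preds @ preds.t()

def self_relation_loss(student_preds, teacher_preds):
    # Align the student's internal relation structure with the teacher's.
    r_student = pairwise_relation(student_preds)
    r_teacher = pairwise_relation(teacher_preds)
    return F.mse_loss(r_student, r_teacher.detach())

def mutual_relation_loss(student_preds, teacher_preds):
    # Cross-model relations: similarity between each student prediction and
    # every teacher prediction, pushed toward the identity so that matched
    # predictions agree while unmatched ones stay apart.
    s = F.normalize(student_preds, dim=1)
    t = F.normalize(teacher_preds.detach(), dim=1)
    cross = s @ t.t()
    target = torch.eye(cross.size(0), device=cross.device)
    return F.mse_loss(cross, target)
```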
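Likewise, the dynamic weight adaptation strategy is only named, not specified, in the abstract. As a stand-in, the sketch below uses gradient-similarity weighting (Du et al., 2018), a standard auxiliary-learning heuristic: the distillation (auxiliary) loss is down-weighted whenever its gradient on the shared parameters conflicts with the detection (primary) gradient. The function name and the usage shown in the comments are assumptions, not the authors' method; it also assumes every shared parameter participates in both losses.

```python
import torch
import torch.nn.functional as F

def gradient_cosine_weight(det_loss, distill_loss, shared_params):
    # Cosine similarity between the auxiliary and primary gradients on the
    # shared parameters, clamped at zero: the distillation signal is kept
    # only to the extent it agrees with the detection objective.
    g_det = torch.autograd.grad(det_loss, shared_params, retain_graph=True)
    g_dis = torch.autograd.grad(distill_loss, shared_params, retain_graph=True)
    v_det = torch.cat([g.reshape(-1) for g in g_det])
    v_dis = torch.cat([g.reshape(-1) for g in g_dis])
    cos = F.cosine_similarity(v_det, v_dis, dim=0)
    return cos.clamp(min=0.0)

# Usage inside a training step (det_loss, distill_loss, student assumed):
# w = gradient_cosine_weight(det_loss, distill_loss,
#                            list(student.backbone.parameters()))
# total_loss = det_loss + w.detach() * distill_loss
# total_loss.backward()
```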
Pages: 4796-4810
Page count: 15