Closed-loop unified knowledge distillation for dense object detection

Cited by: 4
Authors
Song, Yaoye [1 ,2 ]
Zhang, Peng [1 ,2 ]
Huang, Wei [3 ]
Zha, Yufei [1 ,2 ]
You, Tao [1 ]
Zhang, Yanning [1 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Ningbo Inst, Xian, Peoples R China
[3] Nanchang Univ, Sch Math & Comp Sci, Nanchang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Triple parallel distillation; Hierarchical re-weighting attention distillation; Dense object detection; Closed-loop unified
DOI
10.1016/j.patcog.2023.110235
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Most knowledge distillation methods for object detection are feature-based and have achieved competitive results. However, distilling only the feature-imitation part does not take full advantage of the more sophisticated detection-head designs used in object detection, especially dense object detection. In this paper, a triple parallel distillation (TPD) is proposed which can efficiently transfer all of the output responses in the detection head from teacher to student. Moreover, to overcome the limited gains obtained by simply combining feature-based with response-based distillation, a hierarchical re-weighting attention distillation (HRAD) is proposed, which makes the student learn more than the teacher from the feature information and establishes reciprocal feedback between the classification-IoU joint representation of the detection head and the attention-based features. By jointly exploiting the benefits of TPD and HRAD, a closed-loop unified knowledge distillation for dense object detection is proposed, which makes feature-based and response-based distillation unified and complementary. Experiments on different benchmark datasets show that the proposed work outperforms other state-of-the-art distillation methods for dense object detection in both accuracy and robustness.
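The abstract describes a unified objective that combines feature-based imitation with response-based distillation from the detection head. As a rough illustration only, the minimal PyTorch sketch below shows one conventional way such a combined loss is assembled: the L2 feature imitation, the temperature-softened KL divergence on classification logits, the tensor shapes, and the weights w_feat and w_resp are all assumptions for illustration, not the paper's TPD/HRAD formulation.

import torch
import torch.nn.functional as F

def feature_imitation_loss(student_feat, teacher_feat):
    # Feature-based term: L2 imitation between size-matched FPN feature maps.
    return F.mse_loss(student_feat, teacher_feat)

def response_distillation_loss(student_logits, teacher_logits, tau=2.0):
    # Response-based term: softened KL divergence on per-anchor
    # classification logits (standard Hinton-style KD, assumed here).
    p_teacher = F.softmax(teacher_logits / tau, dim=-1)
    log_p_student = F.log_softmax(student_logits / tau, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * tau * tau

def total_kd_loss(s_feat, t_feat, s_logits, t_logits, w_feat=1.0, w_resp=1.0):
    # Combined objective: feature imitation plus head-response transfer;
    # the loss weights are hypothetical placeholders.
    return (w_feat * feature_imitation_loss(s_feat, t_feat)
            + w_resp * response_distillation_loss(s_logits, t_logits))

# Toy usage: one 256-channel 20x20 FPN level and 8 anchors over 80 classes.
s_feat, t_feat = torch.randn(1, 256, 20, 20), torch.randn(1, 256, 20, 20)
s_logits, t_logits = torch.randn(8, 80), torch.randn(8, 80)
print(total_kd_loss(s_feat, t_feat, s_logits, t_logits))

In this generic form the two terms are merely summed; the paper's closed-loop design instead couples them so the feature-based and response-based signals reinforce each other.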
Pages: 11