Closed-loop unified knowledge distillation for dense object detection

Cited: 4
Authors
Song, Yaoye [1 ,2 ]
Zhang, Peng [1 ,2 ]
Huang, Wei [3 ]
Zha, Yufei [1 ,2 ]
You, Tao [1 ]
Zhang, Yanning [1 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian, Shaanxi, Peoples R China
[2] Northwestern Polytech Univ, Ningbo Inst, Xian, Peoples R China
[3] Nanchang Univ, Sch Math & Comp Sci, Nanchang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Triple parallel distillation; Hierarchical re-weighting attention distillation; Dense object detection; Closed-loop unified;
DOI
10.1016/j.patcog.2023.110235
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Most knowledge distillation methods for object detection are feature-based and have achieved competitive results. However, distilling only the feature-imitation part does not take full advantage of the more sophisticated detection-head designs used in object detection, especially dense object detection. In this paper, a triple parallel distillation (TPD) is proposed which can efficiently transfer all of the output responses of the detection head from teacher to student. Moreover, to overcome the limited improvement obtained by simply combining feature-based and response-based distillation, a hierarchical re-weighting attention distillation (HRAD) is proposed to make the student learn more than the teacher from the feature information, and to provide reciprocal feedback between the classification-IoU joint representation of the detection head and the attention-based features. By jointly exploiting the benefits of TPD and HRAD, a closed-loop unified knowledge distillation for dense object detection is proposed, which makes feature-based and response-based distillation unified and complementary. Experiments on different benchmark datasets show that the proposed work outperforms other state-of-the-art distillation methods for dense object detection in both accuracy and robustness.
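The combination described in the abstract, a response-based distillation over the dense detection head together with an attention re-weighted feature imitation, can be sketched in a few lines of PyTorch. The snippet below is only an illustrative assumption of how such a unified loss might be wired together: the function names, the spatial-attention formulation, and the IoU-quality weighting are hypothetical stand-ins and are not taken from the paper's actual TPD/HRAD implementation.

    # Minimal sketch (assumption): unified feature-based + response-based distillation loss,
    # loosely in the spirit of the TPD + HRAD design described in the abstract.
    # All names below are hypothetical and not from the authors' released code.
    import torch
    import torch.nn.functional as F

    def attention_map(feat):
        # Spatial attention: channel-wise mean of absolute activations, normalized over H*W.
        a = feat.abs().mean(dim=1, keepdim=True)            # (N, 1, H, W)
        return F.softmax(a.flatten(2), dim=-1).view_as(a)

    def feature_distill_loss(f_s, f_t):
        # Feature imitation re-weighted by the teacher's spatial attention (HRAD-like stand-in).
        w = attention_map(f_t).detach()
        return (w * (f_s - f_t.detach()).pow(2)).sum() / f_s.size(0)

    def response_distill_loss(cls_s, cls_t, reg_s, reg_t, iou_t, T=2.0):
        # Response distillation over the dense head outputs: soft classification targets plus
        # box-regression imitation, weighted by the teacher's predicted IoU quality (assumption).
        p_t = torch.sigmoid(cls_t.detach() / T)              # soft targets, (N_anchors, C)
        kl = F.binary_cross_entropy_with_logits(cls_s / T, p_t, reduction="none").mean(dim=-1)
        reg = F.smooth_l1_loss(reg_s, reg_t.detach(), reduction="none").mean(dim=-1)
        w = iou_t.detach()                                    # (N_anchors,) quality weights
        return ((kl + reg) * w).sum() / w.sum().clamp(min=1.0)

    def unified_kd_loss(student_out, teacher_out, lambda_feat=1.0, lambda_resp=1.0):
        # Joint optimization of both terms so the feature-based and response-based signals
        # complement each other (a sketch, not the authors' exact closed-loop formulation).
        l_feat = sum(feature_distill_loss(fs, ft)
                     for fs, ft in zip(student_out["feats"], teacher_out["feats"]))
        l_resp = response_distill_loss(student_out["cls"], teacher_out["cls"],
                                       student_out["reg"], teacher_out["reg"],
                                       teacher_out["iou"])
        return lambda_feat * l_feat + lambda_resp * l_resp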
Pages: 11
Related Papers
50 records in total
  • [31] Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection
    Yang, Longrong
    Zhou, Xianpan
    Li, Xuewei
    Qiao, Liang
    Li, Zheyang
    Yang, Ziwei
    Wang, Gaoang
    Li, Xi
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 17129 - 17138
  • [32] Unified Unsupervised Salient Object Detection via Knowledge Transfer
    Yuan, Yao
    Liu, Wutao
    Gao, Pan
    Dai, Qun
    Qin, Jie
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 1616 - 1624
  • [33] Dense Tiny Object Detection: A Scene Context Guided Approach and a Unified Benchmark
    Zhao, Zhicheng
    Du, Jiaxin
    Li, Chenglong
    Fang, Xiang
    Xiao, Yun
    Tang, Jin
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 13
  • [34] Structured Knowledge Distillation for Dense Prediction
    Liu, Yifan
    Shu, Changyong
    Wang, Jingdong
    Shen, Chunhua
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (06) : 7035 - 7049
  • [35] KDSMALL: A lightweight small object detection algorithm based on knowledge distillation
    Zhou, Wen
    Wang, Xiaodon
    Fan, Yusheng
    Yang, Yishuai
    Wen, Yihan
    Li, Yixuan
    Xu, Yicheng
    Lin, Zhengyuan
    Chen, Langlang
    Yao, Shizhou
    Zequn, Liu
    Wang, Jianqing
    COMPUTER COMMUNICATIONS, 2024, 219 : 271 - 281
  • [36] Knowledge Distillation based Compact Model Learning Method for Object Detection
    Ko, Jong Gook
    Yoo, Wonyoung
    11TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE: DATA, NETWORK, AND AI IN THE AGE OF UNTACT (ICTC 2020), 2020, : 1276 - 1278
  • [37] ROBUST AND ACCURATE OBJECT DETECTION VIA SELF-KNOWLEDGE DISTILLATION
    Xu, Weipeng
    Chu, Pengzhi
    Xie, Renhao
    Xiao, Xiongziyan
    Huang, Hongcheng
    2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 91 - 95
  • [38] Revisiting knowledge distillation for light-weight visual object detection
    Gao, Tianze
    Gao, Yunfeng
    Li, Yu
    Qin, Peiyuan
    TRANSACTIONS OF THE INSTITUTE OF MEASUREMENT AND CONTROL, 2021, 43 (13) : 2888 - 2898
  • [39] GAN-Knowledge Distillation for One-Stage Object Detection
    Wang, Wanwei
    Hong, Wei
    Wang, Feng
    Yu, Jinke
    IEEE ACCESS, 2020, 8 : 60719 - 60727
  • [40] Knowledge Distillation in Object Detection for Resource-Constrained Edge Computing
    Setyanto, Arief
    Sasongko, Theopilus Bayu
    Fikri, Muhammad Ainul
    Ariatmanto, Dhani
    Agastya, I. Made Artha
    Rachmanto, Rakandhiya Daanii
    Ardana, Affan
    Kim, In Kee
    IEEE ACCESS, 2025, 13 : 18200 - 18214