Prediction-Guided Distillation for Dense Object Detection

Cited by: 12
Authors
Yang, Chenhongyi [1 ]
Ochal, Mateusz [2 ,3 ]
Storkey, Amos [2 ]
Crowley, Elliot J. [1 ]
Affiliations
[1] University of Edinburgh, School of Engineering, Edinburgh, Midlothian, Scotland
[2] University of Edinburgh, School of Informatics, Edinburgh, Midlothian, Scotland
[3] Heriot-Watt University, School of Engineering & Physical Sciences, Edinburgh, Midlothian, Scotland
Source
Computer Vision - ECCV 2022, Lecture Notes in Computer Science, Springer
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Dense object detection; Knowledge distillation
DOI
10.1007/978-3-031-20077-9_8
CLC classification
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Real-world object detection models should be cheap and accurate. Knowledge distillation (KD) can boost the accuracy of a small, cheap detection model by leveraging useful information from a larger teacher model. However, a key challenge is identifying the most informative features produced by the teacher for distillation. In this work, we show that only a very small fraction of features within a ground-truth bounding box are responsible for a teacher's high detection performance. Based on this, we propose Prediction-Guided Distillation (PGD), which focuses distillation on these key predictive regions of the teacher and yields considerable gains in performance over many existing KD baselines. In addition, we propose an adaptive weighting scheme over the key regions to smooth out their influence and achieve even better performance. Our proposed approach outperforms current state-of-the-art KD baselines on a variety of advanced one-stage detection architectures. Specifically, on the COCO dataset, our method achieves between +3.1% and +4.6% AP improvement using ResNet-101 and ResNet-50 as the teacher and student backbones, respectively. On the CrowdHuman dataset, we achieve +3.2% and +2.0% improvements in MR and AP, also using these backbones. Our code is available at https://github.com/ChenhongyiYang/PGD.
Pages: 123-138 (16 pages)
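
As a rough illustration of the method summarized in the abstract, the PyTorch sketch below distills student features only at the teacher's most predictive locations and smooths their influence with an adaptive (here, softmax) weighting. The names (pgd_loss, quality, topk) and the exact weighting are assumptions made for this sketch, not the authors' implementation; the linked repository contains the real one.

    # Minimal sketch of prediction-guided feature distillation, assuming
    # `quality` holds a precomputed per-location teacher prediction score
    # (e.g. classification score * IoU, zeroed outside ground-truth boxes).
    import torch

    def pgd_loss(feat_s, feat_t, quality, topk=16):
        # feat_s, feat_t: (B, C, H, W) student / teacher feature maps
        # quality:        (B, H, W)    teacher prediction quality map
        B, C, H, W = feat_t.shape
        q = quality.flatten(1)                         # (B, H*W)
        scores, idx = q.topk(min(topk, q.shape[1]), dim=1)
        w = scores.softmax(dim=1)                      # adaptive weights, sum to 1
        fs = feat_s.flatten(2).transpose(1, 2)         # (B, H*W, C)
        ft = feat_t.flatten(2).transpose(1, 2)
        sel = idx.unsqueeze(-1).expand(-1, -1, C)      # gather index, (B, k, C)
        diff = (fs.gather(1, sel) - ft.gather(1, sel)).pow(2).mean(-1)  # (B, k)
        return (w * diff).sum(dim=1).mean()            # weighted MSE over key regions

    if __name__ == "__main__":
        s = torch.randn(2, 256, 32, 32, requires_grad=True)
        t = torch.randn(2, 256, 32, 32)
        q = torch.rand(2, 32, 32)       # stand-in quality scores
        pgd_loss(s, t, q).backward()    # gradients flow only to the student

In the paper, distillation operates on FPN features with key regions selected per ground-truth instance; this sketch collapses that to a single global top-k per image for brevity.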