Knowledge distillation for object detection with diffusion model

Cited: 0
Authors
Zhang, Yi [1 ]
Long, Junzong [1 ]
Li, Chunrui [1 ]
Affiliations
[1] Sichuan Univ, Dept Comp Sci, Chengdu, Peoples R China
Keywords
Object detection; Knowledge distillation; Diffusion model; Noise prediction
DOI
10.1016/j.neucom.2025.130019
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Knowledge distillation is a method that transfers information from a larger network (the teacher) to a smaller network (the student), so that the student inherits the strong performance of the teacher while keeping its computational cost relatively low. Knowledge distillation has been widely applied in object detection to counter the rapid growth of model size. In this paper, we propose an object detector based on knowledge distillation. However, directly mimicking the teacher's features often fails to achieve the desired results because of extra noise in the features extracted by the student, which causes significant inconsistency between the two feature sets and may even weaken the student's capability. To address this issue, we use a diffusion model to remove this noise, narrowing the gap between the features extracted by the teacher and the student and thereby improving the student's performance. Furthermore, we develop a noise matching module that matches the noise level of the student features during the denoising process. Extensive experiments on COCO and Pascal VOC validate the effectiveness of the proposed method: it achieves 40.0% mAP and 81.63% mAP respectively, while maintaining a frame rate of 27.3 FPS, demonstrating the superiority of our model in both accuracy and speed.
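The core idea of the abstract — treating the student's features as a noisy version of the teacher's, denoising them, and only then applying the feature-imitation loss — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the iterative moving-average `denoise` is a hypothetical stand-in for the reverse diffusion (noise-prediction) process, and the 1-D signals stand in for detector feature maps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: the "teacher feature" is a clean signal, and the
# "student feature" is the same signal corrupted by Gaussian noise.
teacher_feat = np.sin(np.linspace(0, 4 * np.pi, 256))
student_feat = teacher_feat + 0.3 * rng.standard_normal(256)

def denoise(feat, steps=5, kernel=5):
    """Hypothetical denoiser: repeated box-filter smoothing as a
    stand-in for the reverse diffusion (noise-prediction) steps."""
    out = feat.copy()
    weights = np.ones(kernel) / kernel
    for _ in range(steps):
        out = np.convolve(out, weights, mode="same")
    return out

def distill_loss(student, teacher):
    """Feature-imitation loss: mean squared error against the teacher."""
    return float(np.mean((student - teacher) ** 2))

raw_loss = distill_loss(student_feat, teacher_feat)
denoised_loss = distill_loss(denoise(student_feat), teacher_feat)

# Denoising the student feature before imitation narrows the gap
# to the teacher, which is the motivation stated in the abstract.
print(raw_loss > denoised_loss)
```

In the paper this role is played by a learned diffusion model with a noise matching module that estimates how noisy the student features are; the smoothing filter here merely demonstrates why removing student-side noise shrinks the distillation loss.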
Pages: 11