Knowledge distillation for object detection with diffusion model

Times cited: 0
Authors
Zhang, Yi [1 ]
Long, Junzong [1 ]
Li, Chunrui [1 ]
Affiliations
[1] Sichuan Univ, Dept Comp Sci, Chengdu, Peoples R China
Keywords
Object detection; Knowledge distillation; Diffusion model; Noise prediction;
DOI
10.1016/j.neucom.2025.130019
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Knowledge distillation is a method that transfers knowledge from a larger network (the teacher) to a smaller one (the student), so that the student inherits the teacher's strong performance while keeping its computational cost relatively low. It has been widely applied in object detection to counter the rapid growth of model size. In this paper, we propose an object detector based on knowledge distillation. However, directly mimicking the teacher's features often fails to achieve the desired results because of the extra noise in the features extracted by the student, which causes significant inconsistency between the two and may even weaken the student's capability. To address this issue, we employ a diffusion model to remove the noise, narrowing the gap between the features extracted by the teacher and the student and thereby improving the student's performance. Furthermore, we develop a noise matching module that matches the noise level of the student features during the denoising process. Extensive experiments on COCO and Pascal VOC validate the effectiveness of the proposed method: it achieves 40.0% mAP and 81.63% mAP respectively while maintaining a frame rate of 27.3 FPS, demonstrating its superiority in both accuracy and speed.
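The core idea of the abstract can be illustrated with a minimal NumPy sketch: the student feature is modeled as the teacher feature plus noise, a (here, oracle) noise predictor stands in for the learned diffusion network, and one reverse-diffusion-style update shrinks the feature gap before the distillation loss is computed. All names (`predict_noise`, `denoise_step`, the perturbation scale) are illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for flattened teacher/student feature maps.
teacher_feat = rng.normal(size=(256,))
noise = rng.normal(scale=0.5, size=(256,))
student_feat = teacher_feat + noise  # premise: student = teacher + extra noise

def mse(a, b):
    """Feature-imitation (distillation) loss."""
    return float(np.mean((a - b) ** 2))

def predict_noise(feat):
    # Oracle residual plus a small error, standing in for the learned
    # noise-prediction network of the diffusion model.
    return (feat - teacher_feat) + rng.normal(scale=0.05, size=feat.shape)

def denoise_step(feat, step_size=1.0):
    # One reverse-diffusion-style update: subtract the predicted noise.
    return feat - step_size * predict_noise(feat)

loss_before = mse(student_feat, teacher_feat)   # gap with raw student features
denoised = denoise_step(student_feat)
loss_after = mse(denoised, teacher_feat)        # gap after denoising
```

Even with an imperfect noise predictor, `loss_after` is far smaller than `loss_before`, which is the mechanism the paper relies on: distilling against denoised student features rather than raw ones.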
Pages: 11