Knowledge Distillation in Fourier Frequency Domain for Dense Prediction

Cited: 0
Authors
Shi, Min [1 ]
Zheng, Chengkun [1 ]
Yi, Qingming [1 ]
Weng, Jian [1 ,2 ]
Luo, Aiwen [1 ]
Affiliations
[1] Jinan Univ, Coll Informat Sci & Technol, Dept Elect Engn, Guangzhou 510632, Peoples R China
[2] Jinan Univ, Coll Cyber Secur, Guangzhou 510632, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Frequency-domain analysis; Feature extraction; Semantics; Knowledge engineering; Detectors; Accuracy; Technological innovation; Object detection; Head; Discrete Fourier transforms; Dense prediction; Fourier transform; knowledge distillation; object detection; semantic segmentation;
DOI
10.1109/LSP.2024.3515795
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Codes
0808; 0809
Abstract
Knowledge distillation has been widely used to enhance student network performance for dense prediction tasks. Most previous knowledge distillation methods focus on valuable regions of the feature map in the spatial domain, ignoring the semantic information in the frequency domain. This work explores effective information representation of feature maps in the frequency domain and proposes a novel distillation method in the Fourier domain. This approach enhances the student's amplitude representation and transmits both original feature knowledge and global pixel relations. Experiments on object detection and semantic segmentation tasks, covering both homogeneous and heterogeneous distillation, demonstrate significant improvements for the student network. For instance, the ResNet50-RepPoints detector and ResNet18-PSPNet segmenter achieve 4.2% AP and 5.01% mIoU improvements on the COCO2017 and Cityscapes datasets, respectively.
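The letter's exact loss formulation is not given in this record. As a minimal, hypothetical sketch of the idea the abstract describes (matching the student's amplitude spectrum to the teacher's after a 2-D discrete Fourier transform), the function below assumes a plain L2 penalty between amplitude spectra; the function name and the unweighted form are illustrative assumptions, not the authors' method.

```python
import numpy as np

def fourier_amplitude_distillation_loss(f_student, f_teacher):
    """Compare two feature maps of shape (C, H, W) by their 2-D Fourier
    amplitude spectra. Sketch only: the paper's actual loss may weight
    frequencies or combine amplitude with phase/relation terms."""
    # 2-D DFT over the spatial axes of each channel
    spec_s = np.fft.fft2(f_student, axes=(-2, -1))
    spec_t = np.fft.fft2(f_teacher, axes=(-2, -1))
    # Amplitude (magnitude) spectra; phase is discarded here
    amp_s = np.abs(spec_s)
    amp_t = np.abs(spec_t)
    # Mean squared error between amplitude spectra over all channels/bins
    return float(np.mean((amp_s - amp_t) ** 2))
```

In a training loop this term would typically be added, with a balancing weight, to the student's ordinary task loss; because the DFT aggregates every spatial position into each frequency bin, matching amplitudes is one way to transmit the global pixel relations the abstract mentions.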
Pages: 296-300
Page count: 5
Related Papers
50 in total
  • [1] Structured Knowledge Distillation for Dense Prediction
    Liu, Yifan
    Shu, Changyong
    Wang, Jingdong
    Shen, Chunhua
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (06) : 7035 - 7049
  • [2] Channel-wise Knowledge Distillation for Dense Prediction
    Shu, Changyong
    Liu, Yifan
    Gao, Jianfei
    Yan, Zheng
    Shen, Chunhua
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 5291 - 5300
  • [3] Multi-Task Learning with Knowledge Distillation for Dense Prediction
    Xu, Yangyang
    Yang, Yibo
    Zhang, Lefei
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 21493 - 21502
  • [4] Target Category Agnostic Knowledge Distillation With Frequency-Domain Supervision
    Tang, Wenxiao
    Shakeel, M. Saad
    Chen, Zisheng
    Wan, Hao
    Kang, Wenxiong
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (07) : 8462 - 8471
  • [5] Generative Denoise Distillation: Simple stochastic noises induce efficient knowledge transfer for dense prediction
    Liu, Zhaoge
    Xu, Xiaohao
    Cao, Yunkang
    Shen, Weiming
    KNOWLEDGE-BASED SYSTEMS, 2024, 302
  • [6] Knowledge transfer via distillation from time and frequency domain for time series classification
    Ouyang, Kewei
    Hou, Yi
    Zhang, Ye
    Ma, Chao
    Zhou, Shilin
    APPLIED INTELLIGENCE, 2023, 53 (02) : 1505 - 1516
  • [7] A self-distillation object segmentation method via frequency domain knowledge augmentation
    Chen, Lei
    Cao, Tieyong
    Zheng, Yunfei
    Fang, Zheng
    IET COMPUTER VISION, 2023, 17 (03) : 341 - 351
  • [8] Knowledge distillation of multi-scale dense prediction transformer for self-supervised depth estimation
    Song, Jimin
    Lee, Sang Jun
    SCIENTIFIC REPORTS, 2023, 13 (01)