Knowledge Distillation in Fourier Frequency Domain for Dense Prediction

Citations: 0
Authors
Shi, Min [1 ]
Zheng, Chengkun [1 ]
Yi, Qingming [1 ]
Weng, Jian [1 ,2 ]
Luo, Aiwen [1 ]
Affiliations
[1] Jinan Univ, Coll Informat Sci & Technol, Dept Elect Engn, Guangzhou 510632, Peoples R China
[2] Jinan Univ, Coll Cyber Secur, Guangzhou 510632, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Frequency-domain analysis; Feature extraction; Semantics; Knowledge engineering; Detectors; Accuracy; Technological innovation; Object detection; Head; Discrete Fourier transforms; Dense prediction; Fourier transform; knowledge distillation; object detection; semantic segmentation;
DOI
10.1109/LSP.2024.3515795
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Knowledge distillation has been widely used to enhance student network performance on dense prediction tasks. Most previous knowledge distillation methods focus on valuable regions of the feature map in the spatial domain, ignoring the semantic information in the frequency domain. This work explores effective information representation of feature maps in the frequency domain and proposes a novel distillation method in the Fourier domain. This approach enhances the student's amplitude representation and transmits both original feature knowledge and global pixel relations. Experiments on object detection and semantic segmentation tasks, covering both homogeneous and heterogeneous distillation, demonstrate significant improvements for the student network. For instance, the ResNet50-RepPoints detector and the ResNet18-PSPNet segmenter achieve 4.2% AP and 5.01% mIoU improvements on the COCO2017 and Cityscapes datasets, respectively.
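The abstract describes matching the student's Fourier amplitude spectrum to the teacher's while also transmitting the original feature knowledge. The paper's exact loss is not given here, so the following is only a minimal NumPy sketch of that idea under stated assumptions: the function name `fourier_distillation_loss`, the weights `alpha`/`beta`, and the plain MSE terms are all hypothetical, not the authors' formulation.

```python
import numpy as np

def fourier_distillation_loss(f_s, f_t, alpha=1.0, beta=1.0):
    """Hypothetical sketch of a Fourier-domain distillation loss.

    f_s, f_t : (C, H, W) student / teacher feature maps.
    The amplitude spectrum of the 2-D DFT captures global intensity
    statistics, so matching it transfers global pixel relations; a
    spatial-domain MSE term keeps the original feature knowledge.
    """
    # 2-D DFT over the spatial dimensions of each channel.
    F_s = np.fft.fft2(f_s, axes=(-2, -1))
    F_t = np.fft.fft2(f_t, axes=(-2, -1))
    # Amplitude-matching term (frequency domain).
    amp_loss = np.mean((np.abs(F_s) - np.abs(F_t)) ** 2)
    # Plain feature-imitation term (spatial domain).
    feat_loss = np.mean((f_s - f_t) ** 2)
    return alpha * amp_loss + beta * feat_loss

rng = np.random.default_rng(0)
t = rng.standard_normal((4, 8, 8))           # teacher features
s = t + 0.1 * rng.standard_normal((4, 8, 8)) # imperfect student
print(fourier_distillation_loss(t, t))  # identical features give zero loss
print(fourier_distillation_loss(s, t))  # positive for a mismatched student
```

In practice such a term would be added to the task loss during student training; the real method additionally handles heterogeneous teacher/student feature shapes, which this sketch omits.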
Pages: 296-300
Number of pages: 5
Related Papers
50 records in total
  • [21] Lightweight Spectrum Prediction Based on Knowledge Distillation
    Cheng, Runmeng
    Zhang, Jianzhao
    Deng, Junquan
    Zhu, Yanping
    RADIOENGINEERING, 2023, 32 (04) : 469 - 478
  • [22] Communication Traffic Prediction with Continual Knowledge Distillation
    Li, Hang
    Wang, Ju
    Hu, Chengming
    Chen, Xi
    Liu, Xue
    Jang, Seowoo
    Dudek, Gregory
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 5481 - 5486
  • [23] Embracing the Dark Knowledge: Domain Generalization Using Regularized Knowledge Distillation
    Wang, Yufei
    Li, Haoliang
    Chau, Lap-pui
    Kot, Alex C.
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 2595 - 2604
  • [24] Knowledge Distillation for Semi-supervised Domain Adaptation
    Orbes-Arteaga, Mauricio
    Cardoso, Jorge
    Sorensen, Lauge
    Igel, Christian
    Ourselin, Sebastien
    Modat, Marc
    Nielsen, Mads
    Pai, Akshay
    OR 2.0 CONTEXT-AWARE OPERATING THEATERS AND MACHINE LEARNING IN CLINICAL NEUROIMAGING, 2019, 11796 : 68 - 76
  • [25] MSTNet-KD: Multilevel Transfer Networks Using Knowledge Distillation for the Dense Prediction of Remote-Sensing Images
    Zhou, Wujie
    Li, Yangzhen
    Huan, Juan
    Liu, Yuanyuan
    Jiang, Qiuping
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 12
  • [26] Cross-domain recommendation via knowledge distillation
    Li, Xiuze
    Huang, Zhenhua
    Wu, Zhengyang
    Wang, Changdong
    Chen, Yunwen
    KNOWLEDGE-BASED SYSTEMS, 2025, 311
  • [27] Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation
    Bin Shah, Sayed Rafay
    Putty, Shreyas Subhash
    Schwung, Andreas
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 1202 - 1207
  • [28] DaFKD: Domain-aware Federated Knowledge Distillation
    Wang, Haozhao
    Li, Yichen
    Xu, Wenchao
    Li, Ruixuan
    Zhan, Yufeng
    Zeng, Zhigang
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 20412 - 20421
  • [29] Cross-domain knowledge distillation for text classification
    Zhang, Shaokang
    Jiang, Lei
    Tan, Jianlong
    NEUROCOMPUTING, 2022, 509 : 11 - 20
  • [30] Joint Progressive Knowledge Distillation and Unsupervised Domain Adaptation
    Nguyen-Meidine, Le Thanh
    Granger, Eric
    Kiran, Madhu
    Dolz, Jose
    Blais-Morin, Louis-Antoine
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,