TAKD: Target-Aware Knowledge Distillation for Remote Sensing Scene Classification

Cited by: 2
Authors
Wu, Jie [1 ]
Fang, Leyuan [2 ,3 ]
Yue, Jun [4 ]
Affiliations
[1] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Peoples R China
[2] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Peoples R China
[3] Peng Cheng Lab, Shenzhen 518000, Peoples R China
[4] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image classification; remote sensing imagery; knowledge distillation; lightweight model; NETWORK;
DOI
10.1109/TCSVT.2024.3391018
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Remote sensing (RS) scene classification based on deep neural networks (DNNs) has recently drawn remarkable attention. However, DNNs contain a large number of parameters and incur substantial computational costs, which makes them difficult to deploy on edge devices such as onboard embedded systems. To address this issue, we propose a target-aware knowledge distillation (TAKD) method for RS scene classification. By exploiting the distinct characteristics of the target and background regions in RS images, TAKD adaptively distills knowledge from the teacher model to create a lightweight student model. Specifically, we first introduce a target extraction module that utilizes heatmaps to highlight target regions on the teacher's feature maps. Next, we propose an adaptive fusion module that aggregates these heatmaps to capture objects of varying scales. Finally, we design a target-aware loss that transfers knowledge in the target regions from the teacher model to the student model, greatly reducing background disturbance. Our distillation scheme requires no extra learnable parameters and is both simple and effective, significantly improving the accuracy of the student model without any additional computational or resource costs. Experiments on three benchmark datasets demonstrate that the proposed TAKD outperforms existing state-of-the-art distillation methods.
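The abstract describes a pipeline of heatmap extraction, multi-scale heatmap fusion, and a heatmap-weighted distillation loss. The following is a minimal, illustrative sketch of how such a scheme could be implemented, assuming channel-aligned teacher and student features and a simple squared-energy heatmap; the function names (target_heatmap, fuse_heatmaps, target_aware_loss) are hypothetical and not taken from the paper.

```python
# Minimal sketch of heatmap-weighted feature distillation (assumptions noted above).
import torch
import torch.nn.functional as F


def target_heatmap(feat: torch.Tensor) -> torch.Tensor:
    """Derive a spatial heatmap from a teacher feature map (B, C, H, W)
    via channel-wise squared energy, normalized to [0, 1] per sample."""
    energy = feat.pow(2).mean(dim=1, keepdim=True)          # (B, 1, H, W)
    flat = energy.view(energy.size(0), -1)
    lo = flat.min(dim=1, keepdim=True).values
    hi = flat.max(dim=1, keepdim=True).values
    flat = (flat - lo) / (hi - lo + 1e-6)
    return flat.view_as(energy)


def fuse_heatmaps(heatmaps, size):
    """Aggregate heatmaps from layers of different scales by resizing
    them to a common resolution and averaging."""
    resized = [F.interpolate(h, size=size, mode="bilinear", align_corners=False)
               for h in heatmaps]
    return torch.stack(resized, dim=0).mean(dim=0)


def target_aware_loss(student_feat, teacher_feat, heatmap):
    """Heatmap-weighted MSE between channel-aligned student and teacher
    features, emphasizing target regions and suppressing background."""
    if student_feat.shape[-2:] != teacher_feat.shape[-2:]:
        student_feat = F.interpolate(student_feat, size=teacher_feat.shape[-2:],
                                     mode="bilinear", align_corners=False)
    diff = (student_feat - teacher_feat).pow(2).mean(dim=1, keepdim=True)
    return (heatmap * diff).sum() / (heatmap.sum() + 1e-6)


if __name__ == "__main__":
    # Toy example: two teacher layers at different scales, one student layer.
    teacher_feats = [torch.randn(2, 256, 28, 28), torch.randn(2, 512, 14, 14)]
    student_feat = torch.randn(2, 512, 14, 14)
    fused = fuse_heatmaps([target_heatmap(f) for f in teacher_feats], size=(14, 14))
    print(target_aware_loss(student_feat, teacher_feats[-1], fused).item())
```

This sketch adds no learnable parameters, consistent with the abstract's claim, but the exact heatmap definition, fusion rule, and loss form in TAKD may differ.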
Pages: 8188-8200
Page count: 13
Related Papers
50 records in total
  • [1] Knowledge Distillation via the Target-aware Transformer
    Lin, Sihao
    Xie, Hongwei
    Wang, Bing
    Yu, Kaicheng
    Chang, Xiaojun
    Liang, Xiaodan
    Wang, Gang
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 10905 - 10914
  • [2] Lightweight remote sensing scene classification based on knowledge distillation
    Zhang, Chong-Yang
    Wang, Bin
    JOURNAL OF INFRARED AND MILLIMETER WAVES, 2024, 43 (05) : 684 - 695
  • [3] A CNN-TRANSFORMER KNOWLEDGE DISTILLATION FOR REMOTE SENSING SCENE CLASSIFICATION
    Nabi, Mostaan
    Maggiolo, Luca
    Moser, Gabriele
    Serpico, Sebastiano B.
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022, : 663 - 666
  • [4] A Double Knowledge Distillation Model for Remote Sensing Image Scene Classification
    Li D.
    Nan Y.
    Liu Y.
    DIANZI YU XINXI XUEBAO/JOURNAL OF ELECTRONICS AND INFORMATION TECHNOLOGY, 2023, 45 (10): 3558 - 3567
  • [5] Multidimensional knowledge distillation for multimodal scene classification of remote sensing images
    Fan, Xiaomin
    Zhou, Wujie
    DIGITAL SIGNAL PROCESSING, 2025, 157
  • [6] Knowledge Distillation of Grassmann Manifold Network for Remote Sensing Scene Classification
    Tian, Ling
    Wang, Zhichao
    He, Bokun
    He, Chu
    Wang, Dingwen
    Li, Deshi
    REMOTE SENSING, 2021, 13 (22)
  • [7] FROM COARSE TO FINE: KNOWLEDGE DISTILLATION FOR REMOTE SENSING SCENE CLASSIFICATION
    Ji, Jinsheng
    Xi, Xiaoming
    Lu, Xiankai
    Guo, Yiyou
    Xie, Huan
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 5427 - 5430
  • [8] Multispectral-to-RGB Knowledge Distillation for Remote Sensing Image Scene Classification
    Shin, Hong-Kyu
    Uhm, Kwang-Hyun
    Jung, Seung-Won
    Ko, Sung-Jea
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [9] Remote Sensing Image Scene Classification Model Based on Dual Knowledge Distillation
    Li, Daxiang
    Nan, Yixuan
    Liu, Ying
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [10] Efficient knowledge distillation using a shift window target-aware transformer
    Feng, Jing
    Ong, Wen Eng
    APPLIED INTELLIGENCE, 2025, 55 (03)