Variational Self-Distillation for Remote Sensing Scene Classification

Cited by: 20
Authors
Hu, Yutao [1 ]
Huang, Xin [1 ]
Luo, Xiaoyan [2 ]
Han, Jungong [3 ]
Cao, Xianbin [1 ,4 ]
Zhang, Jun [5 ]
Affiliations
[1] Beihang Univ, Sch Elect & Informat Engn, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Astronaut, Beijing 100191, Peoples R China
[3] Aberystwyth Univ, Dept Comp Sci, Aberystwyth SY23 3FL, Dyfed, Wales
[4] Minist Ind & Informat Technol China, Key Lab Adv Technol Near Space Informat Syst, Beijing 100804, Peoples R China
[5] Beijing Inst Technol, Adv Res Inst Multidisciplinary Sci, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Remote sensing; Training; Representation learning; Uncertainty; Perturbation methods; Knowledge transfer; Computational modeling; Class entanglement information; hierarchical knowledge transfer; remote sensing scene classification; self-distillation; CONVOLUTIONAL NEURAL-NETWORKS;
DOI
10.1109/TGRS.2022.3194549
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Subject Classification Codes
0708; 070902;
Abstract
Supported by deep learning techniques, remote sensing scene classification, a fundamental task in remote sensing image analysis, has recently made remarkable progress. However, due to the severe uncertainty and perturbation within an image, it remains a challenging task with many unsolved problems. In this article, we note that regular one-hot labels cannot precisely describe remote sensing images; they fail to provide enough information for supervision and thus limit the discriminative feature learning of the network. To solve this problem, we propose a variational self-distillation network (VSDNet), in which the class entanglement information from the prediction vector acts as a supplement to the category information. The exploited information is then hierarchically distilled from the deep layers into the shallow parts via a variational knowledge transfer (VKT) module. Notably, the VKT module performs knowledge distillation in a probabilistic way through variational estimation, which enables end-to-end optimization of mutual information and promotes robustness to uncertainty within the image. Extensive experiments on four challenging remote sensing datasets demonstrate that, with a negligible parameter increase, the proposed VSDNet brings a significant performance improvement over different backbone networks and delivers state-of-the-art results.
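To illustrate the kind of variational knowledge transfer the abstract describes, the sketch below shows one common way to optimize a variational lower bound on the mutual information between a deep and a shallow feature map: a lightweight head predicts a Gaussian over the deep feature from the shallow one, and minimizing the negative log-likelihood tightens the bound. This is a minimal, hedged sketch under stated assumptions (module name, 1x1-conv head, and shared per-channel variance are illustrative), not the authors' VKT implementation.

```python
# Minimal sketch (an assumption, not the authors' released code) of a
# variational knowledge-transfer term: a small head maps the shallow feature
# to a Gaussian over the deep feature; minimizing the negative log-likelihood
# maximizes a variational lower bound on their mutual information.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalTransferLoss(nn.Module):
    def __init__(self, shallow_channels: int, deep_channels: int):
        super().__init__()
        # 1x1 convolution predicts the mean of q(deep | shallow).
        self.mean_head = nn.Conv2d(shallow_channels, deep_channels, kernel_size=1)
        # Per-channel log-variance, shared over spatial positions (an assumption).
        self.log_var = nn.Parameter(torch.zeros(deep_channels))

    def forward(self, shallow_feat: torch.Tensor, deep_feat: torch.Tensor) -> torch.Tensor:
        # Align spatial resolution, since shallow maps are typically larger.
        shallow_feat = F.adaptive_avg_pool2d(shallow_feat, deep_feat.shape[-2:])
        mu = self.mean_head(shallow_feat)
        log_var = self.log_var.view(1, -1, 1, 1)
        # Gaussian negative log-likelihood up to an additive constant.
        nll = 0.5 * (log_var + (deep_feat - mu) ** 2 / log_var.exp())
        return nll.mean()


# Example usage with hypothetical feature maps from a ResNet-style backbone.
if __name__ == "__main__":
    shallow = torch.randn(2, 256, 28, 28)   # e.g. an earlier stage
    deep = torch.randn(2, 2048, 7, 7)       # e.g. the last stage
    vkt_loss = VariationalTransferLoss(256, 2048)
    print(vkt_loss(shallow, deep).item())
```

In practice such a term would typically be added to the classification cross-entropy for each deep-shallow layer pair, with the softened prediction vector supplying the class entanglement signal described in the abstract; the exact pairing and loss weighting used in VSDNet are given in the paper, not here.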
Pages: 13
Related Papers (50 in total)
  • [1] Class-Aware Self-Distillation for Remote Sensing Image Scene Classification
    Wu, Bin
    Hao, Siyuan
    Wang, Wei
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 2173 - 2188
  • [2] Embedded Self-Distillation in Compact Multibranch Ensemble Network for Remote Sensing Scene Classification
    Zhao, Qi
    Ma, Yujing
    Lyu, Shuchang
    Chen, Lijiang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [3] LaST: Label-Free Self-Distillation Contrastive Learning With Transformer Architecture for Remote Sensing Image Scene Classification
    Wang, Xuying
    Zhu, Jiawei
    Yan, Zhengliang
    Zhang, Zhaoyang
    Zhang, Yunsheng
    Chen, Yansheng
    Li, Haifeng
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [4] Learn by Yourself: A Feature-Augmented Self-Distillation Convolutional Neural Network for Remote Sensing Scene Image Classification
    Shi, Cuiping
    Ding, Mengxiang
    Wang, Liguo
    Pan, Haizhu
    REMOTE SENSING, 2023, 15 (23)
  • [5] Self-Supervision and Self-Distillation with Multilayer Feature Contrast for Supervision Collapse in Few-Shot Remote Sensing Scene Classification
    Zhou, Haonan
    Du, Xiaoping
    Li, Sen
    REMOTE SENSING, 2022, 14 (13)
  • [6] Remote Sensing Image Scene Classification with Noisy Label Distillation
    Zhang, Rui
    Chen, Zhenghao
    Zhang, Sanxing
    Song, Fei
    Zhang, Gang
    Zhou, Quancheng
    Lei, Tao
    REMOTE SENSING, 2020, 12 (15)
  • [7] Lightweight remote sensing scene classification based on knowledge distillation
    Zhang, Chong-Yang
    Wang, Bin
    JOURNAL OF INFRARED AND MILLIMETER WAVES, 2024, 43 (05) : 684 - 695
  • [8] Tolerant Self-Distillation for image classification
    Liu, Mushui
    Yu, Yunlong
    Ji, Zhong
    Han, Jungong
    Zhang, Zhongfei
    NEURAL NETWORKS, 2024, 174
  • [9] Image classification based on self-distillation
    Li, Yuting
    Qing, Linbo
    He, Xiaohai
    Chen, Honggang
    Liu, Qiang
    APPLIED INTELLIGENCE, 2023, 53 (08) : 9396 - 9408