Boosting Lightweight CNNs Through Network Pruning and Knowledge Distillation for SAR Target Recognition

Cited by: 15
Authors
Wang, Zhen [1 ]
Du, Lan [1 ]
Li, Yi [1 ]
Affiliation
[1] Xidian Univ, Natl Lab Radar Signal Proc, Xian 710071, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Convolutional neural network (CNN); knowledge distillation; model compression; network pruning; synthetic aperture radar (SAR) target recognition; CLASSIFICATION;
DOI
10.1109/JSTARS.2021.3104267
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
Deep convolutional neural networks (CNNs) have yielded remarkable results in synthetic aperture radar (SAR) target recognition. However, overparameterization is a widely recognized property of deep CNNs, and most previous works pursued high accuracy while neglecting the requirements of model deployment in radar systems, i.e., low computational and memory cost. Therefore, further research on lightweight CNNs for SAR target recognition is necessary. In this article, we devise an effective CNN with a channel-wise attention mechanism for SAR target recognition, and then compress the network structure and recover the lightweight network's performance through network pruning and knowledge distillation, respectively. The attention values produced by the network are utilized to evaluate the importance of convolution kernels, and unimportant kernels are pruned. In addition, a novel bridge-connection-based knowledge distillation method is proposed. Instead of directly mimicking the hidden-layer output or artificially designing a function to extract the knowledge in hidden layers, bridge connections are introduced to distill internal knowledge via the teacher network. Experiments are conducted on the moving and stationary target acquisition and recognition (MSTAR) benchmark dataset. The proposed network has excellent generalization performance and reaches an accuracy of 99.46% on the classification of ten-class targets without any data augmentation. Furthermore, through the network pruning and knowledge distillation algorithm, we remove 90% of the parameters of the proposed CNN while maintaining model performance.
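The two core operations described in the abstract — ranking convolution kernels by their attention values to decide which to prune, and training the compact student against temperature-softened teacher outputs — can be sketched in a minimal, framework-free form. The function names and the use of per-channel attention scores below are illustrative assumptions, not the authors' implementation; in particular, the paper's bridge connections between teacher and student hidden layers are omitted from this sketch.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; higher T yields a softer distribution."""
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def prune_channels(attention_scores, keep_ratio):
    """Rank channels by attention value and return the indices of the
    top keep_ratio fraction (the kernels that survive pruning)."""
    k = max(1, int(len(attention_scores) * keep_ratio))
    ranked = sorted(range(len(attention_scores)),
                    key=lambda i: attention_scores[i], reverse=True)
    return sorted(ranked[:k])

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 as is conventional in distillation."""
    p = softmax(teacher_logits, temperature)   # teacher (target)
    q = softmax(student_logits, temperature)   # student
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature ** 2
```

For example, with attention scores `[0.9, 0.1, 0.5, 0.7]` and a 50% keep ratio, channels 0 and 3 survive; the distillation loss is zero when the student exactly matches the teacher's logits and grows as their softened distributions diverge.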
Pages: 8386-8397
Page count: 12
Related Papers
50 records (first 10 shown)
  • [1] Multilevel Adaptive Knowledge Distillation Network for Incremental SAR Target Recognition
    Yu, Xuelian
    Dong, Fulu
    Ren, Haohao
    Zhang, Chengfa
    Zou, Lin
    Zhou, Yun
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [2] A Lightweight SAR Ship Detection Network Based on Deep Multiscale Grouped Convolution, Network Pruning, and Knowledge Distillation
    Hu, Boyi
    Miao, Hongxia
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2025, 18 : 2190 - 2207
  • [3] Learning Slimming SAR Ship Object Detector Through Network Pruning and Knowledge Distillation
    Chen, Shiqi
    Zhan, Ronghui
    Wang, Wei
    Zhang, Jun
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2021, 14 : 1267 - 1282
  • [4] TGNet: A Lightweight Infrared Thermal Image Gesture Recognition Network Based on Knowledge Distillation and Model Pruning
    Chen, L.
    Sun, Q.
    Xu, Z.
    Liao, Y.
    2024 CROSS STRAIT RADIO SCIENCE AND WIRELESS TECHNOLOGY CONFERENCE, CSRSWTC 2024, 2024, : 96 - 98
  • [5] Lightweight SAR target detection based on channel pruning and knowledge distillation
    Huang Q.
    Jin G.
    Xiong X.
    Wang L.
    Li J.
    Cehui Xuebao/Acta Geodaetica et Cartographica Sinica, 2024, 53 (04): : 712 - 723
  • [6] Lightweight detection network for bridge defects based on model pruning and knowledge distillation
    Guan, Bin
    Li, Junjie
    STRUCTURES, 2024, 62
  • [7] Boosting LightWeight Depth Estimation via Knowledge Distillation
    Hu, Junjie
    Fan, Chenyou
    Jiang, Hualie
    Guo, Xiyue
    Gao, Yuan
    Lu, Xiangyong
    Lam, Tin Lun
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2023, 2023, 14117 : 27 - 39
  • [8] A Lightweight Fully Convolutional Neural Network for SAR Automatic Target Recognition
    Yu, Jimin
    Zhou, Guangyu
    Zhou, Shangbo
    Yin, Jiajun
    REMOTE SENSING, 2021, 13 (15)
  • [9] A Lightweight Identification Method for Complex Power Industry Tasks Based on Knowledge Distillation and Network Pruning
    Wang, Wendi
    Zhou, Xiangling
    Jiang, Chengling
    Zhu, Hong
    Yu, Hao
    Wang, Shufan
    PROCESSES, 2023, 11 (09)
  • [10] Deep network pruning: A comparative study on CNNs in face recognition
    Alonso-Fernandez, Fernando
    Hernandez-Diaz, Kevin
    Rubio, Jose Maria Buades
    Tiwari, Prayag
    Bigun, Josef
    PATTERN RECOGNITION LETTERS, 2025, 189 : 221 - 228