Boosting Lightweight CNNs Through Network Pruning and Knowledge Distillation for SAR Target Recognition

Cited by: 15
Authors
Wang, Zhen [1 ]
Du, Lan [1 ]
Li, Yi [1 ]
Affiliations
[1] Xidian Univ, Natl Lab Radar Signal Proc, Xian 710071, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Convolutional neural network (CNN); knowledge distillation; model compression; network pruning; synthetic aperture radar (SAR) target recognition; CLASSIFICATION;
DOI
10.1109/JSTARS.2021.3104267
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline classification codes
0808; 0809;
Abstract
Deep convolutional neural networks (CNNs) have achieved remarkable results in synthetic aperture radar (SAR) target recognition. However, overparameterization is a widely recognized property of deep CNNs, and most previous work has pursued high accuracy while neglecting the requirements of model deployment in radar systems, namely low computational and memory cost. Further research on lightweight CNNs for SAR target recognition is therefore necessary. In this article, we devise an effective CNN with a channel-wise attention mechanism for SAR target recognition, and then compress the network structure and recover the lightweight network's performance through network pruning and knowledge distillation, respectively. The attention values produced by the network are used to evaluate the importance of convolution kernels, and unimportant kernels are pruned. In addition, a novel bridge-connection-based knowledge distillation method is proposed. Instead of directly mimicking hidden-layer outputs or hand-designing a function to extract the knowledge in hidden layers, bridge connections are introduced to distill internal knowledge via the teacher network. Experiments are conducted on the moving and stationary target acquisition and recognition (MSTAR) benchmark dataset. The proposed network generalizes well and reaches an accuracy of 99.46% on the ten-class classification task without any data augmentation. Furthermore, through the network pruning and knowledge distillation algorithm, we prune 90% of the parameters of the proposed CNN while maintaining model performance.
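To make the attention-guided pruning step described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' released code: an SE-style channel-attention module produces per-channel weights, and output channels of a convolution layer whose average attention score is low are removed. The module structure, the keep_ratio parameter, and the calibration-set averaging are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SEAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # x: (N, C, H, W) -> per-channel attention weights in (0, 1)
        w = self.fc(x.mean(dim=(2, 3)))  # global average pooling + MLP
        return x * w.view(x.size(0), -1, 1, 1), w


def prune_by_attention(conv: nn.Conv2d, attn_scores: torch.Tensor, keep_ratio: float = 0.5):
    """Keep the output channels of `conv` with the highest average attention.

    attn_scores: mean attention weight per output channel, averaged over a
    calibration set (shape: [out_channels]); keep_ratio is a hypothetical knob.
    """
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    keep_idx = torch.argsort(attn_scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned, keep_idx
```

In this sketch, attn_scores would be obtained by averaging the attention weights w over a calibration batch; the bridge-connection distillation stage described in the abstract would then be used to recover the pruned model's accuracy.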
Pages: 8386 - 8397
Number of pages: 12
Related papers
50 records in total
  • [21] Local Pruning Global Pruned Network Under Knowledge Distillation
    Luo, Hanyi
    Tang, Hao
    Zhan, Kun
    INTERNATIONAL CONFERENCE ON COMPUTER VISION, APPLICATION, AND DESIGN (CVAD 2021), 2021, 12155
  • [22] A lightweight crack segmentation network based on knowledge distillation
    Wang, Wenjun
    Su, Chao
    Han, Guohui
    Zhang, Heng
    JOURNAL OF BUILDING ENGINEERING, 2023, 76
  • [23] Lightweight Neural Network With Knowledge Distillation for CSI Feedback
    Cui, Yiming
    Guo, Jiajia
    Cao, Zheng
    Tang, Huaze
    Wen, Chao-Kai
    Jin, Shi
    Wang, Xin
    Hou, Xiaolin
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2024, 72 (08) : 4917 - 4929
  • [24] Lightweight Alpha Matting Network Using Distillation-Based Channel Pruning
    Yoon, Donggeun
    Park, Jinsun
    Cho, Donghyeon
    COMPUTER VISION - ACCV 2022, PT III, 2023, 13843 : 103 - 119
  • [25] What, Where, and How to Transfer in SAR Target Recognition Based on Deep CNNs
    Huang, Zhongling
    Pan, Zongxu
    Lei, Bin
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2020, 58 (04) : 2324 - 2336
  • [26] Knowledge distillation based lightweight domain adversarial neural network for electroencephalogram-based emotion recognition
    Wang, Zhe
    Wang, Yongxiong
    Tang, Yiheng
    Pan, Zhiqun
    Zhang, Jiapeng
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 95
  • [27] PocketNet: Extreme Lightweight Face Recognition Network Using Neural Architecture Search and Multistep Knowledge Distillation
    Boutros, Fadi
    Siebke, Patrick
    Klemt, Marcel
    Damer, Naser
    Kirchbuchner, Florian
    Kuijper, Arjan
    IEEE ACCESS, 2022, 10 : 46823 - 46833
  • [28] A lightweight speech recognition method with target-swap knowledge distillation for Mandarin air traffic control communications
    Ren, Jin
    Yang, Shunzhi
    Shi, Yihua
    Yang, Jinfeng
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [30] Efficient Scene Text Detection in Images with Network Pruning and Knowledge Distillation
    Orenbas, Halit
    Oymagil, Anil
    Baydar, Melih
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,