Boosting Lightweight CNNs Through Network Pruning and Knowledge Distillation for SAR Target Recognition

Cited by: 15
Authors
Wang, Zhen [1 ]
Du, Lan [1 ]
Li, Yi [1 ]
Affiliations
[1] Xidian Univ, Natl Lab Radar Signal Proc, Xian 710071, Peoples R China
Funding
National Science Foundation (USA);
Keywords
Convolutional neural network (CNN); knowledge distillation; model compression; network pruning; synthetic aperture radar (SAR) target recognition; CLASSIFICATION;
DOI
10.1109/JSTARS.2021.3104267
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Deep convolutional neural networks (CNNs) have achieved remarkable results in synthetic aperture radar (SAR) target recognition. However, overparameterization is a widely recognized property of deep CNNs, and most previous works pursue high accuracy while neglecting the requirements of model deployment in radar systems, i.e., low computational cost and low memory footprint. Further research on lightweight CNNs for SAR target recognition is therefore necessary. In this article, we devise an effective CNN with a channel-wise attention mechanism for SAR target recognition, and then compress the network structure and recover the lightweight network's performance through network pruning and knowledge distillation, respectively. The attention values produced by the network are used to evaluate the importance of convolution kernels, and unimportant kernels are pruned. In addition, a novel bridge-connection-based knowledge distillation method is proposed. Instead of directly mimicking the hidden-layer outputs or hand-designing a function to extract the knowledge in hidden layers, bridge connections are introduced to distill internal knowledge via the teacher network. Experiments are conducted on the moving and stationary target acquisition and recognition (MSTAR) benchmark dataset. The proposed network shows excellent generalization performance and reaches an accuracy of 99.46% on the ten-class classification task without any data augmentation. Furthermore, through the network pruning and knowledge distillation algorithm, we remove 90% of the parameters of the proposed CNN while maintaining model performance.
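The two compression steps summarized in the abstract can be sketched in code. The following is a minimal illustration only, assuming PyTorch and an SE-style (squeeze-and-excitation) channel-attention block; the module names, the 10% keep ratio, and the bridge_distill_loss helper are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' released code): attention-guided channel
# pruning and a bridge-style hidden-layer distillation loss, assuming PyTorch
# and an SE-like channel-attention block. Names and ratios are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel-wise attention.
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # x: (N, C, H, W); w: per-channel attention weights, shape (N, C)
        w = self.fc(x.mean(dim=(2, 3)))
        return x * w.view(x.size(0), -1, 1, 1), w

def channel_importance(attention_weights):
    # Average attention per channel over a batch (or the whole training set);
    # a higher mean attention value marks a more important convolution kernel.
    return attention_weights.mean(dim=0)  # shape (C,)

def prune_mask(importance, keep_ratio=0.1):
    # Keep only the top keep_ratio fraction of kernels (about 10% retained,
    # consistent with the 90% parameter reduction reported in the abstract).
    k = max(1, int(keep_ratio * importance.numel()))
    mask = torch.zeros_like(importance, dtype=torch.bool)
    mask[torch.topk(importance, k).indices] = True
    return mask

def bridge_distill_loss(student_feature, teacher_feature, bridge):
    # Bridge-connection style distillation: a small bridge module maps the
    # pruned student's intermediate feature into the teacher's feature space,
    # and the two are matched with an L2 loss instead of hand-crafting a
    # knowledge-extraction function for the hidden layers.
    return F.mse_loss(bridge(student_feature), teacher_feature)

Note that the paper describes the bridge connections as routing the student's internal knowledge through the teacher network itself; the MSE matching above is only one plausible realization of that idea.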
Pages: 8386-8397
Number of pages: 12
Related Papers (50 in total)
  • [31] Kim, Seonghak; Ham, Gyeongdo; Cho, Yucheol; Kim, Daeshik. Robustness-Reinforced Knowledge Distillation With Correlation Distance and Network Pruning. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36(12): 9163-9175.
  • [32] Ren G.; Tu J.; Li Y.; Qiu Z.; Shi W. Yarn state detection based on lightweight network and knowledge distillation. Fangzhi Xuebao/Journal of Textile Research, 2023, 44(09): 205-212.
  • [33] Wu, Yanhui; Zhang, Meng. Lightweight Network Traffic Classification Model Based on Knowledge Distillation. WEB INFORMATION SYSTEMS ENGINEERING - WISE 2021, PT II, 2021, 13081: 107-121.
  • [34] Malihi, Leila; Heidemann, Gunther. Efficient and Controllable Model Compression through Sequential Knowledge Distillation and Pruning. BIG DATA AND COGNITIVE COMPUTING, 2023, 7(03).
  • [35] Han, Ping; Bai, Jirui; Peng, Yanwen; Yang, Lei; Han, Binbin. LSAN: A Novel Lightweight SAR Aircraft Target Detection Network. 2024 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2024), 2024: 9455-9459.
  • [36] Yan, Han; Lu, Wei; Wu, Yuhu. A Lightweight Microscopic Metal Fracture Classification Method With Structure Pruning and Knowledge Distillation for Embedded Devices. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74.
  • [37] Zhong, Hongyu; Yu, Samson; Trinh, Hieu; Lv, Yong; Yuan, Rui; Wang, Yanan. Multiassistant Knowledge Distillation for Lightweight Bearing Fault Diagnosis Based on Decreasing Threshold Channel Pruning. IEEE SENSORS JOURNAL, 2024, 24(01): 486-494.
  • [38] Lin, Huiping; Yin, Junjun; Yang, Jian; Xu, Feng. Interpreting Neural Network Pattern With Pruning for PolSAR Target Recognition. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62.
  • [39] Guo, Jun; Wang, Ling; Zhu, Daiyin; Zhang, Gong. SAR Target Recognition With Limited Samples Based on Meta Knowledge Transferring Using Relation Network. 2020 INTERNATIONAL SYMPOSIUM ON ANTENNAS AND PROPAGATION (ISAP), 2021: 377-378.
  • [40] Yue, Zhibin; Lu, Jianbin; Wan, Lu. Lightweight Transformer Network for Ship HRRP Target Recognition. APPLIED SCIENCES-BASEL, 2022, 12(19).