Small Is Beautiful: Compressing Deep Neural Networks for Partial Domain Adaptation

Cited: 0
Authors
Ma, Yuzhe [1 ]
Yao, Xufeng [2 ]
Chen, Ran [2 ]
Li, Ruiyu [3 ]
Shen, Xiaoyong [3 ]
Yu, Bei [2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol Guangzhou, Microelect Thrust, Guangzhou 511400, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[3] SmartMore Corp Ltd, Hong Kong, Peoples R China
Keywords
Training; Computational modeling; Task analysis; Adaptation models; Deep learning; Taylor series; Supervised learning; neural network compression; transfer learning;
DOI
10.1109/TNNLS.2022.3194533
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Domain adaptation is a promising way to ease the costly data-labeling process in the era of deep learning (DL). A practical setting is partial domain adaptation (PDA), in which the label space of the target domain is a subset of that of the source domain. Although existing methods yield appealing performance on PDA tasks, deep PDA models are likely to carry redundant computation overhead, since the target task is only a subtask of the original problem. In this work, PDA and model compression are seamlessly integrated into a unified training process. The cross-domain distribution divergence is reduced by minimizing a soft-weighted maximum mean discrepancy (SWMMD), which is differentiable and serves as a regularizer during network training. To compress the overparameterized model, we use gradient statistics to identify and prune redundant channels based on the corresponding scaling factors in batch normalization (BN) layers. Experimental results demonstrate that our method achieves classification performance comparable to state-of-the-art methods on various PDA tasks, with a significant reduction in model size and computation overhead.
Pages: 3575-3585
Page count: 11
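The abstract describes two technical ingredients: a soft-weighted maximum mean discrepancy (SWMMD) used as a differentiable regularizer, and channel pruning driven by gradient statistics on batch normalization (BN) scaling factors. The PyTorch sketch below illustrates one plausible reading of both pieces; the Gaussian-kernel choice, the Taylor-style saliency score |gamma * dL/dgamma|, and all names (swmmd, bn_prune_threshold, src_weights, keep_ratio) are illustrative assumptions, not the authors' reference implementation.

    import torch
    import torch.nn as nn

    def gaussian_kernel(x, y, sigma=1.0):
        # Pairwise Gaussian (RBF) kernel between two batches of feature vectors.
        sq_dist = torch.cdist(x, y) ** 2
        return torch.exp(-sq_dist / (2.0 * sigma ** 2))

    def swmmd(src_feat, tgt_feat, src_weights, sigma=1.0):
        # Soft-weighted MMD: each source sample is re-weighted (e.g., by the
        # estimated probability that its class appears in the target domain),
        # so source-only "outlier" classes contribute less to the divergence.
        w = src_weights / (src_weights.sum() + 1e-8)
        k_ss = gaussian_kernel(src_feat, src_feat, sigma)
        k_tt = gaussian_kernel(tgt_feat, tgt_feat, sigma)
        k_st = gaussian_kernel(src_feat, tgt_feat, sigma)
        n_t = tgt_feat.size(0)
        return w @ k_ss @ w + k_tt.mean() - 2.0 * (w @ k_st).sum() / n_t

    def bn_prune_threshold(model, keep_ratio=0.5):
        # Rank every BN channel by a first-order Taylor-style saliency
        # |gamma * dL/dgamma| computed from gradient statistics on the BN
        # scaling factors, and return a global threshold keeping the top
        # `keep_ratio` fraction of channels.
        scores = []
        for m in model.modules():
            if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
                scores.append((m.weight.detach() * m.weight.grad.detach()).abs().flatten())
        scores = torch.cat(scores)
        k = max(1, int(keep_ratio * scores.numel()))
        return torch.topk(scores, k).values.min()

In a unified training loop, the total objective would then combine the supervised source-domain loss with the SWMMD regularizer, e.g. loss = ce(logits_src, labels_src) + lam * swmmd(feat_src, feat_tgt, w), and channels whose saliency falls below the returned threshold would be pruned after (or during) training.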