Learning Instance-wise Sparsity for Accelerating Deep Models

Cited by: 0
Authors
Liu, Chuanjian [1 ]
Wang, Yunhe [1 ]
Han, Kai [1 ]
Xu, Chunjing [1 ]
Xu, Chang [2 ]
Institutions
[1] Huawei Noah's Ark Lab, Beijing, Peoples R China
[2] Univ Sydney, FEIT, Sch Comp Sci, Sydney, NSW, Australia
Funding
Australian Research Council;
Keywords
DOI
N/A
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Exploring deep convolutional neural networks with high efficiency and low memory usage is essential for a wide variety of machine learning tasks. Most existing approaches accelerate deep models by manipulating parameters or filters independently of the data, e.g., pruning and decomposition. In contrast, we study this problem from a different perspective by respecting the differences between data instances. We develop an instance-wise feature pruning method that identifies informative features for each instance. Specifically, by introducing a feature decay regularization, we encourage the intermediate feature maps of each instance in a deep neural network to be sparse while preserving overall network performance. During online inference, subtle features extracted from input images by the intermediate layers of a well-trained network can then be eliminated to accelerate subsequent computation. We further adopt the coefficient of variation as a measure for selecting the layers that are appropriate for acceleration. Extensive experiments on benchmark datasets and networks demonstrate the effectiveness of the proposed method.
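The two ideas in the abstract can be illustrated with a minimal NumPy sketch: zeroing out the weakest channels of one instance's intermediate feature map, and using the coefficient of variation (std/mean) of per-channel activation magnitudes to judge whether a layer is a good candidate for acceleration. The function names, the `keep_ratio` threshold, and the channel-level granularity are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def coefficient_of_variation(feature_map):
    """CV = std / mean of per-channel mean absolute activations.

    A high CV means channel importances vary widely for this instance,
    suggesting the layer has subtle features that can be pruned safely.
    feature_map: array of shape (channels, H, W) for one instance.
    """
    channel_means = np.abs(feature_map).mean(axis=(1, 2))
    return channel_means.std() / (channel_means.mean() + 1e-8)

def prune_instance_features(feature_map, keep_ratio=0.5):
    """Keep only the strongest channels of one instance's feature map.

    Channels are ranked by mean absolute activation; the weakest ones
    are zeroed, so downstream computation on them can be skipped.
    """
    channel_means = np.abs(feature_map).mean(axis=(1, 2))
    k = max(1, int(keep_ratio * feature_map.shape[0]))
    keep = np.argsort(channel_means)[-k:]   # indices of strongest channels
    mask = np.zeros(feature_map.shape[0], dtype=bool)
    mask[keep] = True
    return feature_map * mask[:, None, None]
```

Because the ranking is recomputed per feature map, different input instances keep different channels, which is the instance-wise aspect; a static pruning method would fix one mask for all inputs.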
Pages: 3001 - 3007
Page count: 7
Related Papers
50 results
  • [1] iHAS: Instance-wise Hierarchical Architecture Search for Deep Learning Recommendation Models
    Yu, Yakun
    Qi, Shi-Ang
    Yang, Jiuding
    Jiang, Liyao
    Niu, Di
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3030 - 3039
  • [2] Learning to Transform for Generalizable Instance-wise Invariance
    Singhal, Utkarsh
    Esteves, Carlos
    Makadia, Ameesh
    Yu, Stella X.
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 6188 - 6198
  • [3] Instance-Wise Laplace Mechanism via Deep Reinforcement Learning (Student Abstract)
    Ryu, Sehyun
    Joo, Hosung
    Jang, Jonggyu
    Yang, Hyun Jong
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 21, 2024, : 23640 - 23641
  • [4] Instance-wise multi-view representation learning
    Li, Dan
    Wang, Haibao
    Wang, Yufeng
    Wang, Shengpei
    INFORMATION FUSION, 2023, 91 : 612 - 622
  • [5] Instance-wise Feature Grouping
    Masoomi, Aria
    Wu, Chieh
    Zhao, Tingting
    Wang, Zifeng
    Castaldi, Peter
    Dy, Jennifer
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [6] Learning to Unlearn: Instance-Wise Unlearning for Pre-trained Classifiers
    Cha, Sungmin
    Cho, Sungjun
    Hwang, Dasol
    Lee, Honglak
    Moon, Taesup
    Lee, Moontae
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 10, 2024, : 11186 - 11194
  • [7] Self-Supervised Video Representation Learning Using Improved Instance-Wise Contrastive Learning and Deep Clustering
    Zhu, Yisheng
    Shuai, Hui
    Liu, Guangcan
    Liu, Qingshan
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (10) : 6741 - 6752
  • [8] Synthetic Model Combination: An Instance-wise Approach to Unsupervised Ensemble Learning
    Chan, Alex J.
    van der Schaar, Mihaela
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [9] Instance-wise Grasp Synthesis for Robotic Grasping
    Xu, Yucheng
    Kasaei, Mohammadreza
    Kasaei, Hamidreza
    Li, Zhibin
    PROCEEDINGS - IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, 2023, 2023-MAY : 1744 - 1750
  • [10] Class-wise and instance-wise contrastive learning for zero-shot learning based on VAEGAN
    Zheng, Baolong
    Li, Zhanshan
    Li, Jingyao
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 272