A Tucker decomposition based knowledge distillation for intelligent edge applications

Cited by: 17
Authors
Dai, Cheng [1,2]
Liu, Xingang [1 ]
Li, Zhuolin [1 ]
Chen, Mu-Yen [3 ]
Affiliations
[1] Univ Elect Sci & Technol China UESTC, Sch Informat & Commun Engn, Chengdu, Peoples R China
[2] McMaster Univ, Dept Elect Engn & Comp Sci, Hamilton, ON L8S 4K1, Canada
[3] Natl Cheng Kung Univ, Dept Engn Sci, Tainan, Taiwan
Funding
National Natural Science Foundation of China;
Keywords
Knowledge distillation; Intelligent edge computing; Deep learning; Tensor decomposition; DEEP COMPUTATION MODEL;
DOI
10.1016/j.asoc.2020.107051
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge distillation (KD) has proven to be an effective method in intelligent edge computing and has been studied extensively in recent deep learning research. However, when the teacher network is much stronger than the student network, the effect of knowledge distillation is not ideal. To address this problem, an improved knowledge distillation method (TDKD) is proposed, which transfers the complex mapping functions learned by cumbersome models to relatively simple models. First, Tucker-2 decomposition is performed on the convolutional layers of the original teacher model to reduce the capacity gap between the teacher network and the student network. Then, the decomposed model is used as a new teacher in knowledge distillation for the student model. Experimental results show that the TDKD method can effectively solve the problem of poor distillation performance: it not only achieves better results when standard KD is already effective, but can also, to some extent, reactivate KD in cases where it fails. (C) 2020 Elsevier B.V. All rights reserved.
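The two steps described in the abstract, Tucker-2 decomposition of the teacher's convolutional layers followed by ordinary teacher-student distillation, can be illustrated with a short sketch. The snippet below is not the authors' code; it is a minimal PyTorch illustration under stated assumptions: a plain nn.Conv2d teacher layer with groups=1 and a square kernel, caller-chosen ranks rank_in/rank_out, and the common Hinton-style temperature-scaled distillation loss. The helper names tucker2_decompose_conv and kd_loss, and all hyperparameter values, are illustrative only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def tucker2_decompose_conv(conv: nn.Conv2d, rank_in: int, rank_out: int) -> nn.Sequential:
    """Approximate one KxK convolution by a 1x1 -> KxK -> 1x1 chain via a
    Tucker-2 (HOSVD-style) factorization of its weight tensor.
    Assumes groups=1 and a square kernel; ranks are chosen by the caller."""
    W = conv.weight.data                                   # (C_out, C_in, K, K)
    c_out, c_in, k, _ = W.shape

    # Truncated SVDs of the two channel-mode unfoldings give the factor matrices.
    U_out, _, _ = torch.linalg.svd(W.reshape(c_out, -1), full_matrices=False)
    U_out = U_out[:, :rank_out]                            # (C_out, R_out)
    U_in, _, _ = torch.linalg.svd(
        W.permute(1, 0, 2, 3).reshape(c_in, -1), full_matrices=False)
    U_in = U_in[:, :rank_in]                               # (C_in, R_in)

    # Core tensor: project the output- and input-channel modes onto the bases.
    core = torch.einsum('oikl,or,ip->rpkl', W, U_out, U_in)  # (R_out, R_in, K, K)

    first = nn.Conv2d(c_in, rank_in, kernel_size=1, bias=False)
    mid = nn.Conv2d(rank_in, rank_out, kernel_size=k,
                    stride=conv.stride, padding=conv.padding, bias=False)
    last = nn.Conv2d(rank_out, c_out, kernel_size=1, bias=conv.bias is not None)

    first.weight.data = U_in.t().reshape(rank_in, c_in, 1, 1)   # projects C_in -> R_in
    mid.weight.data = core                                       # spatial conv on the core
    last.weight.data = U_out.reshape(c_out, rank_out, 1, 1)      # maps R_out -> C_out
    if conv.bias is not None:
        last.bias.data = conv.bias.data
    return nn.Sequential(first, mid, last)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Temperature-scaled distillation loss: the decomposed teacher supplies
    the soft targets, the ground-truth labels supply the hard loss."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction='batchmean') * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a TDKD-style setup, each convolutional layer of the teacher would be replaced by its decomposed counterpart, and the reduced teacher's logits would then feed kd_loss while training the student.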
Pages: 7
Related Papers
50 records in total
  • [1] Tucker decomposition and applications
    Bhatt, Vineet
    Kumar, Sunil
    Saini, Seema
    MATERIALS TODAY-PROCEEDINGS, 2021, 46 : 10787 - 10792
  • [2] A light-weight skeleton human action recognition model with knowledge distillation for edge intelligent surveillance applications
    Dai, Cheng
    Lu, Shoupeng
    Liu, Chuanjie
    Guo, Bing
    APPLIED SOFT COMPUTING, 2024, 151
  • [3] Tucker decomposition-based temporal knowledge graph completion
    Shao, Pengpeng
    Zhang, Dawei
    Yang, Guohua
    Tao, Jianhua
    Che, Feihu
    Liu, Tong
    KNOWLEDGE-BASED SYSTEMS, 2022, 238
  • [4] Adaptive Knowledge Distillation-Based Lightweight Intelligent Fault Diagnosis Framework in IoT Edge Computing
    Wang, Yanzhi
    Yu, Ziyang
    Wu, Jinhong
    Wang, Chu
    Zhou, Qi
    Hu, Jiexiang
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (13) : 23156 - 23169
  • [5] Teacher-student knowledge distillation based on decomposed deep feature representation for intelligent mobile applications
    Sepahvand, Majid
    Abdali-Mohammadi, Fardin
    Taherkordi, Amir
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 202
  • [6] KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING
    Mohamed, Ahmed P.
    Jameel, Abu Shafin Mohammad Mahdee
    El Gamal, Aly
    2021 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2021, : 600 - 604
  • [7] Echo State Network Based on Improved Knowledge Distillation for Edge Intelligence
    Zhou, Jian
    Jiang, Yuwen
    Xu, Lijie
    Zhao, Lu
    Xiao, Fu
    CHINESE JOURNAL OF ELECTRONICS, 2024, 33 (01) : 101 - 111
  • [8] Lightweight Edge-side Fault Diagnosis Based on Knowledge Distillation
    Shang, Yingjun
    Feng, Tao
    Huo, Yonghua
    Duan, Yongcun
    Long, Yuhan
    2022 IEEE 14TH INTERNATIONAL CONFERENCE ON ADVANCED INFOCOMM TECHNOLOGY (ICAIT 2022), 2022, : 348 - 353
  • [9] A Lightweight Pipeline Edge Detection Model Based on Heterogeneous Knowledge Distillation
    Zhu, Chengyuan
    Pu, Yanyun
    Lyu, Zhuoling
    Wu, Aonan
    Yang, Kaixiang
    Yang, Qinmin
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2024, 71 (12) : 5059 - 5063