Heterogeneous Knowledge Distillation Using Conceptual Learning

Cited by: 0
Authors
Yu, Yerin [1 ]
Kim, Namgyu [1 ]
Affiliations
[1] Kookmin Univ, Grad Sch Business IT, Seoul 02707, South Korea
Funding
National Research Foundation of Singapore
Keywords
Knowledge distillation; conceptual learning; deep learning; pretrained model; model compression
DOI
10.1109/ACCESS.2024.3387459
CLC Classification Number
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Recent advances in deep learning have produced large, high-performing models pretrained on massive datasets. Deploying these models in real-world services, however, requires fast inference and low computational cost, which has driven interest in model compression techniques such as knowledge distillation, in which the knowledge learned by a teacher model is transferred to a smaller student model. Traditional knowledge distillation is limited in that the student learns from the teacher only the knowledge needed to solve the given task, making it difficult to respond appropriately to cases it has not yet encountered. In this study, we propose a heterogeneous knowledge distillation method in which knowledge is distilled from a teacher model trained on higher-level concepts rather than on the target task itself. The methodology is motivated by the pedagogical finding that problems are solved better when learners acquire not only task-specific knowledge but also general knowledge of higher-level concepts. By moving beyond the limitation that traditional knowledge distillation can transfer knowledge only within the same task, the proposed transfer of heterogeneous knowledge can both improve the performance of lightweight models and broaden the applicability of pretrained teacher models. Classification experiments on the 70,000 images of the Fashion-MNIST benchmark dataset confirm that the proposed heterogeneous knowledge distillation achieves higher classification accuracy than traditional knowledge distillation.
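To make the idea in the abstract concrete, the following is a minimal PyTorch-style sketch of distillation from a teacher trained on a different, higher-concept task. Because such a teacher's output classes need not match the student's, the sketch matches intermediate features through a learned projector instead of logits; all layer sizes, the projector, and the loss weighting are illustrative assumptions, not the architecture or objective reported in the paper.

```python
# Hypothetical sketch: feature-level distillation from a teacher trained on a
# higher-concept task, so teacher and student logit spaces may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StudentWithProjector(nn.Module):
    """Small student CNN plus a linear projector into the teacher's feature space."""

    def __init__(self, num_classes=10, student_dim=64, teacher_dim=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, student_dim, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(student_dim, num_classes)  # student's own task head
        self.projector = nn.Linear(student_dim, teacher_dim)   # aligns feature dimensions

    def forward(self, x):
        feat = self.backbone(x)
        return self.classifier(feat), self.projector(feat)


def heterogeneous_kd_loss(student_logits, projected_feat, teacher_feat, targets, alpha=0.5):
    """Hard-label cross-entropy on the student's task plus a feature-matching
    term against a frozen teacher trained on higher-level concepts."""
    ce = F.cross_entropy(student_logits, targets)
    feat_match = F.mse_loss(projected_feat, teacher_feat)
    return (1.0 - alpha) * ce + alpha * feat_match


if __name__ == "__main__":
    # Toy run on Fashion-MNIST-shaped inputs; random teacher features stand in
    # for the penultimate-layer outputs of a frozen pretrained teacher.
    student = StudentWithProjector()
    images = torch.randn(8, 1, 28, 28)
    targets = torch.randint(0, 10, (8,))
    teacher_feat = torch.randn(8, 256)
    logits, proj = student(images)
    loss = heterogeneous_kd_loss(logits, proj, teacher_feat, targets)
    loss.backward()
    print(float(loss))
```

In such a setup the teacher stays frozen and only the student's backbone, classifier, and projector receive gradient updates.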
Pages: 52803-52814
Page count: 12
Related Papers (50 records)
[1] Dong, Daxiang; Liu, Ji; Wang, Xi; Gong, Weibao; Qin, An; Li, Xingjian; Yu, Dianhai; Valduriez, Patrick; Dou, Dejing. Elastic Deep Learning Using Knowledge Distillation with Heterogeneous Computing Resources. EURO-PAR 2021: PARALLEL PROCESSING WORKSHOPS, 2022, 13098: 116-128.
[2] Chadha, Mohak; Khera, Pulkit; Gu, Jianfeng; Abboud, Osama; Gerndt, Michael. Training Heterogeneous Client Models using Knowledge Distillation in Serverless Federated Learning. 39TH ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING, SAC 2024, 2024: 997-1006.
[3] Song, Yalin; Liu, Hang; Zhao, Shuai; Jin, Haozhe; Yu, Junyang; Liu, Yanhong; Zhai, Rui; Wang, Longge. Fedadkd: Heterogeneous Federated Learning via Adaptive Knowledge Distillation. PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (04).
[4] Zhu, Zhuangdi; Hong, Junyuan; Zhou, Jiayu. Data-Free Knowledge Distillation for Heterogeneous Federated Learning. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139.
[5] Passalis, N.; Tzelepi, M.; Tefas, A. Heterogeneous Knowledge Distillation using Information Flow Modeling. 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020: 2336-2345.
[6] Guo, Sheng; Chen, Hui; Liu, Yang; Yang, Chengyi; Li, Zengxiang; Jin, Cheng Hao. Heterogeneous Federated Learning Framework for IIoT Based on Selective Knowledge Distillation. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2025, 21 (02): 1078-1089.
[7] Chen, Leiming; Zhang, Weishan; Dong, Cihao; Zhao, Dehai; Zeng, Xingjie; Qiao, Sibo; Zhu, Yichang; Tan, Chee Wei. FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation. ENTROPY, 2024, 26 (01).
[8] Yao, Dezhong; Pan, Wanning; Dai, Yutong; Wan, Yao; Ding, Xiaofeng; Yu, Chen; Jin, Hai; Xu, Zheng; Sun, Lichao. FEDGKD: Toward Heterogeneous Federated Learning via Global Knowledge Distillation. IEEE TRANSACTIONS ON COMPUTERS, 2024, 73 (01): 3-17.
[9] Lyu, Feng; Tang, Cheng; Deng, Yongheng; Liu, Tong; Zhang, Yongmin; Zhang, Yaoxue. A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning. 2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023: 37-47.
[10] Lee, Yuna; Baek, Seung Jun. Keyword Spotting with Synthetic Data using Heterogeneous Knowledge Distillation. INTERSPEECH 2022, 2022: 1397-1401.