Virtual Knowledge Distillation via Conditional GAN

Cited by: 6
Authors
Kim, Sihwan [1]
Affiliations
[1] Hana Inst Technol, Big Data & AI Lab, Seoul 06133, South Korea
Source
IEEE ACCESS | 2022 / Vol. 10
Keywords
Training; Generators; Knowledge engineering; Bridges; Generative adversarial networks; Task analysis; Collaborative work; Image classification; model compression; knowledge distillation; self-knowledge distillation; collaborative learning; conditional generative adversarial network;
DOI
10.1109/ACCESS.2022.3163398
Chinese Library Classification (CLC)
TP [Automation and computer technology];
Discipline code
0812;
Abstract
Knowledge distillation aims at transferring the knowledge from a pre-trained complex model, called the teacher, to a relatively smaller and faster one, called the student. Unlike previous works that transfer the teacher's softened distributions or feature spaces, in this paper we propose a novel approach, called Virtual Knowledge Distillation (VKD), that transfers a softened distribution generated by a virtual knowledge generator conditioned on the class label. The virtual knowledge generator is trained independently of, but concurrently with, the teacher to mimic the teacher's softened distributions. Afterwards, when training a student, the virtual knowledge generator can be used in place of the teacher's softened distributions, or combined with existing distillation methods in a straightforward manner. Moreover, with slight modifications, VKD can be applied not only to self-knowledge distillation but also to collaborative learning. We compare our method with several representative distillation methods across various combinations of teacher and student architectures on image classification tasks. Experimental results demonstrate that VKD achieves competitive performance compared to conventional distillation methods and, when combined with them, improves performance by a substantial margin.
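The abstract describes the VKD pipeline only at a high level, so the following is a minimal PyTorch-style sketch of the idea, not the paper's implementation: a small generator conditioned on the class label produces "virtual" logits, it is fitted to the teacher's softened outputs while the teacher trains, and the student is then distilled from those virtual logits. The class and function names (VirtualKnowledgeGenerator, generator_loss, student_loss), the temperature T, the mixing weight alpha, and the plain KL-matching objective used here in place of the paper's conditional-GAN (adversarial) training are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class VirtualKnowledgeGenerator(nn.Module):
    # Produces "virtual" logits conditioned only on the class label (plus noise),
    # i.e. a stand-in for the teacher's softened distribution for that class.
    def __init__(self, num_classes, noise_dim=64, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(num_classes, noise_dim)
        self.net = nn.Sequential(
            nn.Linear(2 * noise_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, labels):
        z = torch.randn(labels.size(0), self.embed.embedding_dim, device=labels.device)
        return self.net(torch.cat([z, self.embed(labels)], dim=1))

def generator_loss(generator, teacher_logits, labels, T=4.0):
    # Fit the generator, concurrently with teacher training, to mimic the
    # teacher's softened distribution (teacher logits are detached).
    virtual_logits = generator(labels)
    return F.kl_div(
        F.log_softmax(virtual_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def student_loss(student_logits, generator, labels, T=4.0, alpha=0.5):
    # Distill the student from the generator's virtual knowledge instead of
    # the teacher's outputs, mixed with the usual cross-entropy term.
    with torch.no_grad():
        virtual_logits = generator(labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(virtual_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * kd + (1.0 - alpha) * F.cross_entropy(student_logits, labels)

To combine VKD with an existing distillation method, one would simply add that method's loss to student_loss above; for the self-distillation and collaborative variants mentioned in the abstract, the logits fed to generator_loss would presumably come from the student itself or from peer models rather than from a separate pre-trained teacher.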
Pages: 34766-34778
Page count: 13