50 records in total
- [1] Neural Compatibility Modeling with Attentive Knowledge Distillation. Proceedings of ACM SIGIR 2018, 2018: 5-14
- [2] Channel Planting for Deep Neural Networks using Knowledge Distillation. 25th International Conference on Pattern Recognition (ICPR), 2021: 7573-7579
- [3] Explaining Knowledge Distillation by Quantifying the Knowledge. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2020), 2020: 12922-12932
- [6] Knowledge distillation on neural networks for evolving graphs. Social Network Analysis and Mining, 2021, 11
- [8] IMF: Integrating Matched Features Using Attentive Logit in Knowledge Distillation. Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), 2023: 974+
- [9] RELIANT: Fair Knowledge Distillation for Graph Neural Networks. Proceedings of the 2023 SIAM International Conference on Data Mining (SDM), 2023: 154+
- [10] Adaptively Denoising Graph Neural Networks for Knowledge Distillation. Machine Learning and Knowledge Discovery in Databases: Research Track and Demo Track, Part VIII (ECML PKDD 2024), 2024, 14948: 253-269