- [43] FreeKD: Free-direction Knowledge Distillation for Graph Neural Networks. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 357-366
- [45] Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
- [47] Knowledge Reverse Distillation Based Confidence Calibration for Deep Neural Networks. Neural Processing Letters, 2023, 55: 345-360
- [48] Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 534-544
- [50] Feature Distribution-based Knowledge Distillation for Deep Neural Networks. 2022 19th International SoC Design Conference (ISOCC), 2022: 75-76