50 records in total
- [41] Revisiting Knowledge Distillation via Label Smoothing Regularization. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 3902-3910
- [42] Decentralized Federated Learning via Mutual Knowledge Distillation. 2023 IEEE International Conference on Multimedia and Expo (ICME), 2023: 342-347
- [43] SKDBERT: Compressing BERT via Stochastic Knowledge Distillation. Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI), Vol. 37, No. 6, 2023: 7414-7422
- [44] Knowledge Distillation via Hypersphere Features Distribution Transfer. Proceedings of the 31st ACM International Conference on Information and Knowledge Management (CIKM 2022), 2022: 4229-4233
- [45] Federated Split Learning via Mutual Knowledge Distillation. IEEE Transactions on Network Science and Engineering, 2024, 11(3): 2729-2741
- [47] Efficient Biomedical Instance Segmentation via Knowledge Distillation. Medical Image Computing and Computer Assisted Intervention (MICCAI 2022), Pt. IV, 2022, 13434: 14-24
- [49] Boosting LightWeight Depth Estimation via Knowledge Distillation. Knowledge Science, Engineering and Management (KSEM 2023), Pt. I, 2023, 14117: 27-39