- [31] Up to 100x Faster Data-Free Knowledge Distillation. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 6597-6604
- [33] Double-Generators Network for Data-Free Knowledge Distillation. Journal of Computer Research and Development (Jisuanji Yanjiu yu Fazhan), 2023, 60(7): 1615-1627
- [34] Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation. 2021 IEEE Winter Conference on Applications of Computer Vision (WACV 2021), 2021: 1429-1437
- [35] Dual discriminator adversarial distillation for data-free model compression. International Journal of Machine Learning and Cybernetics, 2022, 13: 1213-1230
- [37] Frequency Domain Distillation for Data-Free Quantization of Vision Transformer. Pattern Recognition and Computer Vision (PRCV 2023), Part VIII, 2024, 14432: 205-216
- [38] Model Conversion via Differentially Private Data-Free Distillation. Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), 2023: 2187-2195
- [39] Augmented Geometric Distillation for Data-Free Incremental Person ReID. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 7319-7328
- [40] Data-free Knowledge Distillation based on GNN for Node Classification. Database Systems for Advanced Applications (DASFAA 2024), Part 2, 2025, 14851: 243-258