- [1] Compression of Acoustic Model via Knowledge Distillation and Pruning. 2018 24th International Conference on Pattern Recognition (ICPR), 2018: 2785-2790.
- [3] PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation. Interspeech 2021, 2021: 4568-4572.
- [4] Private Knowledge Transfer via Model Distillation with Generative Adversarial Networks. ECAI 2020: 24th European Conference on Artificial Intelligence, 2020, 325: 1794-1801.
- [5] Knowledge Distillation Beyond Model Compression. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 6136-6143.
- [7] Model Compression via Pruning and Knowledge Distillation for Person Re-identification. Journal of Ambient Intelligence and Humanized Computing, 2021, 12: 2149-2161.
- [8] Model Selection - Knowledge Distillation Framework for Model Compression. 2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI), 2021.
- [9] Patient Knowledge Distillation for BERT Model Compression. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019: 4323-4332.
- [10] Triplet Knowledge Distillation Networks for Model Compression. 2021 International Joint Conference on Neural Networks (IJCNN), 2021.