Total: 50 entries
- [23] HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021, pp. 3126-3136.
- [24] A Multi-Domain Role Activation Model. In 2017 IEEE International Conference on Communications (ICC), 2017.
- [25] A Multi-Domain Physical Model Design Optimization Method Considering Uncertainties. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC 2019), 2019, pp. 1678-1682.
- [26] Model Selection - Knowledge Distillation Framework for Model Compression. In 2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021), 2021.
- [27] Patient Knowledge Distillation for BERT Model Compression. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019), 2019, pp. 4323-4332.
- [28] Triplet Knowledge Distillation Networks for Model Compression. In 2021 International Joint Conference on Neural Networks (IJCNN), 2021.
- [29] Private Model Compression via Knowledge Distillation. In Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19), 2019, pp. 1190+.
- [30] Analysis of Model Compression Using Knowledge Distillation. IEEE Access, 2022, 10: 85095-85105.