- [31] Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [32] Boosting Graph Neural Networks via Adaptive Knowledge Distillation. Thirty-Seventh AAAI Conference on Artificial Intelligence, Vol. 37, No. 6, 2023, pp. 7793-7801.
- [33] Adventitious Respiratory Classification using Attentive Residual Neural Networks. Interspeech 2020, 2020, pp. 2912-2916.
- [34] On using neural networks models for distillation control. Distillation and Absorption '97, Vols. 1 and 2, 1997, (142), pp. 259-268.
- [36] Explaining the Unexplained: A CLass-Enhanced Attentive Response (CLEAR) Approach to Understanding Deep Neural Networks. 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2017, pp. 1686-1694.
- [37] A Neural Attentive Model Using Human Semantic Knowledge for Clickbait Detection. 2020 IEEE Intl Symp on Parallel & Distributed Processing with Applications, Intl Conf on Big Data & Cloud Computing, Intl Symp Social Computing & Networking, Intl Conf on Sustainable Computing & Communications (ISPA/BDCloud/SocialCom/SustainCom 2020), 2020, pp. 770-776.
- [38] Frustratingly Easy Knowledge Distillation via Attentive Similarity Matching. 2022 26th International Conference on Pattern Recognition (ICPR), 2022, pp. 2357-2363.
- [39] Improving Neural Topic Models using Knowledge Distillation. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020, pp. 1752-1771.
- [40] Extract the Knowledge of Graph Neural Networks and Go Beyond it: An Effective Knowledge Distillation Framework. Proceedings of the World Wide Web Conference 2021 (WWW 2021), 2021, pp. 1227-1237.