50 records in total
- [31] A Lightweight Method for Graph Neural Networks Based on Knowledge Distillation and Graph Contrastive Learning. Applied Sciences-Basel, 2024, 14(11).
- [32] DHBE: Data-free Holistic Backdoor Erasing in Deep Neural Networks via Restricted Adversarial Distillation. Proceedings of the 2023 ACM Asia Conference on Computer and Communications Security (ASIA CCS 2023), 2023: 731-745.
- [33] Robust Graph Neural Networks Against Adversarial Attacks via Jointly Adversarial Training. IFAC-PapersOnLine, 2020, 53(5): 420-425.
- [34] Data-free Watermark for Deep Neural Networks by Truncated Adversarial Distillation. 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2024), 2024: 4480-4484.
- [35] Extract the Knowledge of Graph Neural Networks and Go Beyond It: An Effective Knowledge Distillation Framework. Proceedings of the World Wide Web Conference 2021 (WWW 2021), 2021: 1227-1237.
- [36] KDGAN: Knowledge Distillation with Generative Adversarial Networks. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018.
- [37] Research on Knowledge Distillation of Generative Adversarial Networks. 2021 Data Compression Conference (DCC 2021), 2021: 376.
- [38] Application of Knowledge Distillation in Generative Adversarial Networks. 2023 3rd Asia-Pacific Conference on Communications Technology and Computer Science (ACCTCS 2023), 2023: 65-71.