50 entries in total
- [42] Hardening Deep Neural Networks via Adversarial Model Cascades. 2019 International Joint Conference on Neural Networks (IJCNN), 2019.
- [43] Compressing Visual-linguistic Model via Knowledge Distillation. 2021 IEEE/CVF International Conference on Computer Vision (ICCV 2021), 2021: 1408-1418.
- [44] Concept Distillation in Graph Neural Networks. Explainable Artificial Intelligence (xAI 2023), Part III, 2023, 1903: 233-255.
- [46] Compressing Deep Neural Networks for Recognizing Places. Proceedings of the 2017 4th IAPR Asian Conference on Pattern Recognition (ACPR), 2017: 352-357.
- [49] Knowledge Reverse Distillation Based Confidence Calibration for Deep Neural Networks. Neural Processing Letters, 2023, 55: 345-360.
- [50] Feature Distribution-based Knowledge Distillation for Deep Neural Networks. 2022 19th International SoC Design Conference (ISOCC), 2022: 75-76.