共 50 条
- [1] Distilling knowledge from Gaussian process teacher to neural network student INTERSPEECH 2023, 2023, : 426 - 430
- [2] Toward linking teacher knowledge and student learning SECOND LANGUAGE TEACHER EDUCATION: INTERNATIONAL PERSPECTIVES, 2005, : 73 - 95
- [3] Distilling Knowledge in Federated Learning 2021 22ND ASIA-PACIFIC NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM (APNOMS), 2021, : 196 - 201
- [4] Distilling the Undistillable: Learning from a Nasty Teacher COMPUTER VISION, ECCV 2022, PT XIII, 2022, 13673 : 587 - 603
- [5] PrUE: Distilling Knowledge from Sparse Teacher Networks MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT III, 2023, 13715 : 102 - 117
- [7] CHED 17-Service learning: Good for the community, good for the teacher, and great for the student ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2009, 238
- [8] Learning Student-Friendly Teacher Networks for Knowledge Distillation ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
- [9] Hybrid Learning with Teacher-student Knowledge Distillation for Recommenders 20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW 2020), 2020, : 227 - 235