50 items in total
- [1] A new perspective for understanding generalization gap of deep neural networks trained with large batch sizes. Applied Intelligence, 2023, 53: 15621-15637.
- [2] Train longer, generalize better: closing the generalization gap in large batch training of neural networks. Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017.
- [4] Understanding and Utilizing Deep Neural Networks Trained with Noisy Labels. International Conference on Machine Learning, Vol. 97, 2019.
- [5] Taming the Noisy Gradient: Train Deep Neural Networks with Small Batch Sizes. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, 2019: 4348-4354.
- [6] Closing the Generalization Gap of Adaptive Gradient Methods in Training Deep Neural Networks. Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence, 2020: 3267-3275.
- [7] New Perspective of Interpretability of Deep Neural Networks. 2020 3rd International Conference on Information and Computer Technologies (ICICT 2020), 2020: 78-85.
- [9] Understanding Attention and Generalization in Graph Neural Networks. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019.
- [10] Open set task augmentation facilitates generalization of deep neural networks trained on small data sets. Neural Computing and Applications, 2022, 34: 6067-6083.