50 entries in total
- [21] Dropout Training, Data-dependent Regularization, and Generalization Bounds. International Conference on Machine Learning (ICML), Vol. 80, 2018.
- [25] The Loss Surface of Deep and Wide Neural Networks. International Conference on Machine Learning (ICML), Vol. 70, 2017.
- [26] Norm Loss: An Efficient yet Effective Regularization Method for Deep Neural Networks. 25th International Conference on Pattern Recognition (ICPR), 2021, pp. 8812–8818.
- [28] On Generalization Bounds of a Family of Recurrent Neural Networks. International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 108, 2020, pp. 1233–1242.
- [30] Implicit Regularization in Deep Learning May Not Be Explainable by Norms. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020.