Total: 50 entries
- [2] Stability vs Implicit Bias of Gradient Methods on Separable Data and Beyond. Conference on Learning Theory (COLT), Vol. 178, 2022.
- [3] Convergence of Gradient Descent on Separable Data. 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), Vol. 89, 2019.
- [6] Tight Risk Bounds for Gradient Descent on Separable Data. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [7] Gradient Descent Converges Linearly for Logistic Regression on Separable Data. International Conference on Machine Learning (ICML), Vol. 202, 2023.
- [9] Commentary: To underfit and to overfit the data. This is the dilemma. Journal of Thoracic and Cardiovascular Surgery, 2020, 160(1): 183.