50 entries in total
- [42] On the Convergence of (Stochastic) Gradient Descent with Extrapolation for Non-Convex Minimization. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), 2019: 4003-4009.
- [43] Conditions for linear convergence of the gradient method for non-convex optimization. Optimization Letters, 2023, 17: 1105-1125.
- [48] On Convergence of Heuristics Based on Douglas-Rachford Splitting and ADMM to Minimize Convex Functions over Nonconvex Sets. 2018 56th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2018: 56-63.
- [50] Linear convergence of the generalized Douglas-Rachford algorithm for feasibility problems. Journal of Global Optimization, 2018, 72: 443-474.