50 entries in total
- [42] A Stochastic Momentum Accelerated Quasi-Newton Method for Neural Networks. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 12973-12974.
- [45] A Unified Analysis of Stochastic Momentum Methods for Deep Learning. Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, 2018: 2955-2961.
- [46] HSB-GDM: A Hybrid Stochastic-Binary Circuit for Gradient Descent with Momentum in the Training of Neural Networks. Proceedings of the 17th ACM International Symposium on Nanoscale Architectures (NANOARCH 2022), 2022.
- [48] Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient-Based Optimization Methods. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020: 3127-3131.
- [49] Minibatch Stochastic Approximate Proximal Point Methods. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020.
- [50] Stochastic Subspace Cubic Newton Method. International Conference on Machine Learning, Vol. 119, 2020.