Low-Precision Stochastic Gradient Langevin Dynamics

Cited by: 0
Authors
Zhang, Ruqi [1 ]
Wilson, Andrew Gordon [2 ]
De Sa, Christopher [3 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
[2] NYU, New York, NY 10012 USA
[3] Cornell Univ, Ithaca, NY 14853 USA
Source
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162 | 2022
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
While low-precision optimization has been widely used to accelerate deep learning, low-precision sampling remains largely unexplored. As a consequence, sampling is simply infeasible in many large-scale scenarios, despite providing remarkable benefits to generalization and uncertainty estimation for neural networks. In this paper, we provide the first study of low-precision Stochastic Gradient Langevin Dynamics (SGLD), showing that its costs can be significantly reduced without sacrificing performance, due to its intrinsic ability to handle system noise. We prove that the convergence of low-precision SGLD with full-precision gradient accumulators is less affected by the quantization error than its SGD counterpart in the strongly convex setting. To further enable low-precision gradient accumulators, we develop a new quantization function for SGLD that preserves the variance in each update step. We demonstrate that low-precision SGLD achieves comparable performance to full-precision SGLD with only 8 bits on a variety of deep learning tasks.
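The abstract describes quantizing SGLD updates while keeping them usable as posterior samples. As a minimal illustration (not the paper's variance-preserving quantization function), the sketch below applies one SGLD step and then stores the weights on a low-precision grid via naive stochastic rounding, which is unbiased but, unlike the paper's quantizer, adds extra variance on top of the injected Gaussian noise. All names here are illustrative assumptions.

```python
import numpy as np

def stochastic_round(x, delta):
    # Unbiased stochastic rounding onto a grid with spacing `delta`:
    # round up with probability equal to the fractional part, so that
    # E[stochastic_round(x, delta)] = x.
    scaled = x / delta
    low = np.floor(scaled)
    prob_up = scaled - low
    up = np.random.rand(*x.shape) < prob_up
    return delta * (low + up)

def sgld_step_low_precision(theta, grad, lr, delta):
    # One SGLD update: gradient step plus N(0, 2*lr) Gaussian noise,
    # followed by quantization of the weights to the low-precision grid.
    noise = np.sqrt(2.0 * lr) * np.random.randn(*theta.shape)
    return stochastic_round(theta - lr * grad + noise, delta)
```

With full-precision gradient accumulators, the paper instead keeps `theta` in full precision and quantizes only the copy used for forward/backward passes; the variance-preserving quantizer further corrects for the variance that stochastic rounding adds to each step.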
Pages: 21
Related Papers
(50 records in total)
  • [1] On Stochastic Roundoff Errors in Gradient Descent with Low-Precision Computation
    Xia, Lu
    Massei, Stefano
    Hochstenbach, Michiel E.
    Koren, Barry
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2024, 200 (02) : 634 - 668
  • [3] Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent
    De Sa, Christopher
    Feldman, Matthew
    Re, Christopher
    Olukotun, Kunle
    44TH ANNUAL INTERNATIONAL SYMPOSIUM ON COMPUTER ARCHITECTURE (ISCA 2017), 2017, : 561 - 574
  • [4] Stochastic Gradient Langevin Dynamics with Variance Reduction
    Huang, Zhishen
    Becker, Stephen
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [5] The promises and pitfalls of Stochastic Gradient Langevin Dynamics
    Brosse, Nicolas
    Moulines, Eric
    Durmus, Alain
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [6] Consistency and Fluctuations For Stochastic Gradient Langevin Dynamics
    Teh, Yee Whye
    Thiery, Alexandre H.
    Vollmer, Sebastian J.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [8] Variance Reduction in Stochastic Gradient Langevin Dynamics
    Dubey, Avinava
    Reddi, Sashank J.
    Poczos, Barnabas
    Smola, Alexander J.
    Xing, Eric P.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [9] Stochastic gradient Langevin dynamics with adaptive drifts
    Kim, Sehwan
    Song, Qifan
    Liang, Faming
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2022, 92 (02) : 318 - 336
  • [10] SWALP: Stochastic Weight Averaging in Low-Precision Training
    Yang, Guandao
    Zhang, Tianyi
    Kirichenko, Polina
    Bai, Junwen
    Wilson, Andrew Gordon
    De Sa, Christopher
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97