A Simple and Efficient Stochastic Rounding Method for Training Neural Networks in Low Precision

Research output: Working paper › Academic


Abstract

Conventional stochastic rounding (CSR) is widely employed in the training of neural networks (NNs), showing promising training results even in low-precision computations. We introduce an improved stochastic rounding method that is simple and efficient. The proposed method succeeds in training NNs with 16-bit fixed-point numbers and provides faster convergence and higher classification accuracy than both CSR and the deterministic round-to-nearest method.
Original language: English
Publication status: Published - 24 Mar 2021

