Efficient and effective training of sparse recurrent neural networks

Research output: Contribution to journal › Journal article › Academic › peer review

23 Citations (Scopus)
109 Downloads (Pure)

Abstract

Recurrent neural networks (RNNs) have achieved state-of-the-art performance on various applications. However, RNNs tend to be memory-bandwidth limited in practical applications, and they require long training and inference times. These problems are at odds with training and deploying RNNs on resource-limited devices, where the memory and floating-point operations (FLOPs) budgets are strictly constrained. Conventional model compression techniques address this problem by reducing inference costs, but they operate on a costly pre-trained model. Recently, dynamic sparse training has been proposed to accelerate the training process by training sparse neural networks directly from scratch. However, previous sparse training techniques are mainly designed for convolutional neural networks and multi-layer perceptrons. In this paper, we introduce a method to train intrinsically sparse RNN models with a fixed number of parameters and FLOPs during training. We demonstrate state-of-the-art sparse performance with long short-term memory and recurrent highway networks on two widely used tasks: language modeling and text classification. These results support the claim that, contrary to the general belief that training a sparse neural network from scratch leads to worse performance than a dense network, sparse training with adaptive connectivity can usually achieve better performance than dense models for RNNs.
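To make the idea of sparse training with adaptive connectivity concrete, the following is a minimal illustrative sketch only, not the paper's implementation: a SET-style prune-and-regrow step applied to the weight matrices of a PyTorch LSTM cell. The class and function names (SparseLSTMCell, make_sparse_mask, evolve) and the density and prune_frac values are hypothetical assumptions for illustration; the paper's actual procedure may differ in detail.

```python
import torch
import torch.nn as nn

def make_sparse_mask(weight, density):
    """Random binary mask keeping roughly `density` of the entries active."""
    return (torch.rand_like(weight) < density).float()

class SparseLSTMCell(nn.Module):
    """LSTM cell whose weight matrices are masked to a fixed parameter budget."""
    def __init__(self, input_size, hidden_size, density=0.33):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.masks = {}
        for name, p in self.cell.named_parameters():
            if p.dim() == 2:  # sparsify weight matrices only, not biases
                self.masks[name] = make_sparse_mask(p, density)
                with torch.no_grad():
                    p.mul_(self.masks[name])

    def forward(self, x, state):
        return self.cell(x, state)

    def apply_masks(self):
        """Re-zero pruned connections; call after every optimizer step."""
        with torch.no_grad():
            for name, p in self.cell.named_parameters():
                if name in self.masks:
                    p.mul_(self.masks[name])

    def evolve(self, prune_frac=0.3):
        """Adaptive connectivity: drop the weakest active weights, then
        regrow the same number of random inactive ones, so the total
        number of parameters (and FLOPs) stays fixed throughout training."""
        with torch.no_grad():
            for name, p in self.cell.named_parameters():
                if name not in self.masks:
                    continue
                mask = self.masks[name]
                n_prune = int(prune_frac * mask.sum().item())
                # Prune: zero the smallest-magnitude active connections.
                scores = torch.where(mask.bool(), p.abs(),
                                     torch.full_like(p, float('inf')))
                drop_idx = torch.topk(scores.flatten(), n_prune,
                                      largest=False).indices
                mask.view(-1)[drop_idx] = 0.0
                p.view(-1)[drop_idx] = 0.0
                # Regrow: activate the same number of random inactive slots.
                inactive = (mask.view(-1) == 0).nonzero(as_tuple=False).flatten()
                perm = torch.randperm(inactive.numel(), device=inactive.device)
                grow_idx = inactive[perm[:n_prune]]
                mask.view(-1)[grow_idx] = 1.0
                p.view(-1)[grow_idx] = torch.randn(grow_idx.numel(),
                                                   device=p.device) * 0.01
```

In a training loop, one would call apply_masks() after each optimizer step to keep pruned weights at zero, and evolve() once per epoch to rewire the connectivity, so the network stays at the same sparsity level from start to finish.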
Original language: English
Pages (from-to): 9625-9636
Number of pages: 12
Journal: Neural Computing and Applications
Volume: 33
Issue number: 15
Early online date: 26 Jan 2021
DOIs
Status: Published - Aug 2021
