On improving deep learning generalization with adaptive sparse connectivity

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer review


Abstract

Large neural networks are very successful on a variety of tasks. However, when training data is limited, the generalization capability of deep neural networks is also limited. In this paper, we present empirical evidence that intrinsically sparse neural networks with adaptive sparse connectivity, which by design operate under a strict parameter budget during training, generalize better than their fully-connected counterparts. In addition, we propose a new technique for training these sparse models that combines the Sparse Evolutionary Training (SET) procedure with neuron pruning. Applied to MultiLayer Perceptrons (MLPs) and evaluated on 15 datasets, the proposed technique zeros out around 50% of the hidden neurons during training while keeping the number of parameters to optimize linear in the number of neurons. The results show competitive classification and generalization performance.
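The SET procedure evolves a sparse topology during training: after each epoch, a fraction of the smallest-magnitude connections is removed and the same number of new connections is grown at random positions. A minimal NumPy sketch of one rewiring step, plus a simple neuron-pruning pass that removes hidden units left without any incoming or outgoing connections (the function names, the `zeta` default, and the regrowth initialization are illustrative assumptions, not the authors' code):

```python
import numpy as np

def set_rewire(W, mask, zeta=0.3, rng=None):
    """One SET rewiring step on a weight matrix W with boolean
    connectivity mask: drop the fraction `zeta` of active weights
    with the smallest magnitude, then regrow the same number of
    connections at random inactive positions."""
    if rng is None:
        rng = np.random.default_rng(0)
    active = np.flatnonzero(mask)
    n_drop = int(zeta * active.size)
    # Drop the smallest-magnitude active connections.
    mags = np.abs(W.ravel()[active])
    drop = active[np.argsort(mags)[:n_drop]]
    mask.ravel()[drop] = False
    W.ravel()[drop] = 0.0
    # Regrow the same number of connections at random empty slots,
    # initialized with small random weights.
    inactive = np.flatnonzero(~mask.ravel())
    grow = rng.choice(inactive, size=n_drop, replace=False)
    mask.ravel()[grow] = True
    W.ravel()[grow] = rng.normal(0.0, 0.01, size=n_drop)
    return W, mask

def prune_dead_neurons(mask_in, mask_out):
    """Zero out hidden neurons that have no incoming or no outgoing
    connections; returns the boolean 'alive' vector per hidden unit.
    mask_in is (inputs, hidden), mask_out is (hidden, outputs)."""
    alive = mask_in.any(axis=0) & mask_out.any(axis=1)
    mask_in[:, ~alive] = False
    mask_out[~alive, :] = False
    return alive
```

Note that the rewiring step keeps the total number of active connections constant, which is what maintains the strict parameter budget described in the abstract; neuron pruning then removes units that the evolving topology has effectively disconnected.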
Original language: English
Title: ICML 2019 Workshop on Understanding and Improving Generalization in Deep Learning
Number of pages: 5
Status: Published - 14 Jun 2019


  • Cite this

    Liu, S., Mocanu, D., & Pechenizkiy, M. (2019). On improving deep learning generalization with adaptive sparse connectivity. In ICML 2019 Workshop on Understanding and Improving Generalization in Deep Learning. arXiv:1906.11626v1