Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science

D.C. Mocanu, E. Mocanu, P. Stone, H.P. Nguyen, M. Gibescu, A. Liotta

Research output: Contribution to journal › Journal article › Academic › peer review

475 Citations (Scopus)
817 Downloads (Pure)

Abstract

Through the success of deep learning in various domains, artificial neural networks are currently among the most used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that (contrary to general practice) artificial neural networks, too, should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm which evolves an initial sparse topology (Erdős–Rényi random graph) of two consecutive layers of neurons into a scale-free topology during learning. Our method replaces the fully-connected layers of artificial neural networks with sparse ones before training, quadratically reducing the number of parameters, with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.
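The rewiring rule summarized above (start from an Erdős–Rényi sparse layer, then repeatedly prune the weakest connections and regrow new ones at random during training) can be sketched for a single weight matrix as follows. This is a minimal NumPy illustration, not the authors' released implementation; the function names and the default values of epsilon (sparsity level) and zeta (rewiring fraction) are assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

def erdos_renyi_mask(n_in, n_out, epsilon=20.0):
    # Each connection exists with probability
    # epsilon * (n_in + n_out) / (n_in * n_out),
    # the Erdos-Renyi sparse initialization described in the abstract.
    # epsilon here is an illustrative default, not a prescribed value.
    p = min(1.0, epsilon * (n_in + n_out) / (n_in * n_out))
    return rng.random((n_in, n_out)) < p

def evolve_connections(weights, mask, zeta=0.3):
    # One rewiring step: remove the fraction zeta of existing connections
    # whose weights are closest to zero, then grow the same number of new
    # connections at randomly chosen inactive positions, so the total
    # number of parameters stays constant while the topology evolves.
    active = np.flatnonzero(mask)
    n_rewire = int(zeta * active.size)
    drop = active[np.argsort(np.abs(weights.flat[active]))[:n_rewire]]
    mask.flat[drop] = False
    weights.flat[drop] = 0.0
    inactive = np.flatnonzero(~mask)
    grow = rng.choice(inactive, size=n_rewire, replace=False)
    mask.flat[grow] = True
    weights.flat[grow] = rng.normal(0.0, 0.01, size=n_rewire)  # near-zero re-init
    return weights, mask
```

Between rewiring steps the masked weights would be trained as usual (for example, one epoch of SGD with weights *= mask re-applied after each update), after which evolve_connections is called; repeated over training, the connectivity drifts toward the scale-free structure the abstract describes.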

Original language: English
Article number: 2383
Number of pages: 12
Journal: Nature Communications
Volume: 9
Issue number: 1
Early online date: 15 Jul 2017
DOIs
Status: Published - 19 Jun 2018

Fingerprint

Dive into the research topics of 'Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science'. Together they form a unique fingerprint.
  • Selfish Sparse RNN Training

    Liu, S., Mocanu, D. C., Pei, Y. & Pechenizkiy, M., 18 Jul 2021, Proceedings of the 38th International Conference on Machine Learning (ICML 2021). PMLR, Vol. 139, pp. 6893-6904.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer review

    Open Access
  • Sparse Training via Boosting Pruning Plasticity with Neuroregeneration

    Liu, S., Chen, T., Chen, X., Atashgahi, Z., Yin, L., Kou, H., Shen, L., Pechenizkiy, M., Wang, Z. & Mocanu, D. C., 2021, Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021. Ranzato, M., Beygelzimer, A., Dauphin, Y., Liang, P. S. & Wortman Vaughan, J. (eds.). Neural information processing systems foundation, pp. 9908-9922, 15 p. (Advances in Neural Information Processing Systems; vol. 12).

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer review

    Open Access

Cite this