Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science

D.C. Mocanu, E. Mocanu, P. Stone, H.P. Nguyen, M. Gibescu, A. Liotta

Research output: Contribution to journal › Article › Academic › peer-review

139 Citations (Scopus)
535 Downloads (Pure)


Owing to the success of deep learning in various domains, artificial neural networks are currently among the most used artificial intelligence methods. Taking inspiration from the network properties of biological neural networks (e.g. sparsity, scale-freeness), we argue that (contrary to general practice) artificial neural networks, too, should not have fully-connected layers. Here we propose sparse evolutionary training of artificial neural networks, an algorithm which evolves an initial sparse topology (Erdős–Rényi random graph) of two consecutive layers of neurons into a scale-free topology during learning. Our method replaces the fully-connected layers of artificial neural networks with sparse ones before training, quadratically reducing the number of parameters, with no decrease in accuracy. We demonstrate our claims on restricted Boltzmann machines, multi-layer perceptrons, and convolutional neural networks for unsupervised and supervised learning on 15 datasets. Our approach has the potential to enable artificial neural networks to scale up beyond what is currently possible.
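The core idea described in the abstract (sparse evolutionary training) can be sketched as: initialize a layer's connectivity with an Erdős–Rényi random mask, then, after each training epoch, prune a fraction of the smallest-magnitude active connections and regrow the same number at random inactive positions. The sketch below is a minimal NumPy illustration of that prune-and-regrow loop; the function names, the `epsilon` density parameter, and the `zeta` rewiring fraction are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def erdos_renyi_mask(n_in, n_out, epsilon=20):
    # Erdős–Rényi sparse mask: each connection exists independently with
    # probability proportional to (n_in + n_out) / (n_in * n_out), so the
    # number of parameters grows linearly rather than quadratically.
    p = epsilon * (n_in + n_out) / (n_in * n_out)
    return rng.random((n_in, n_out)) < p

def evolve_mask(weights, mask, zeta=0.3):
    # One evolution step (simplified sketch): remove the zeta fraction of
    # active connections with the smallest weight magnitude, then regrow
    # the same number of connections at random inactive positions, so the
    # overall sparsity level stays constant while the topology adapts.
    active = np.flatnonzero(mask)
    n_rewire = int(zeta * active.size)
    if n_rewire == 0:
        return mask
    magnitudes = np.abs(weights.ravel()[active])
    pruned = active[np.argsort(magnitudes)[:n_rewire]]
    new_mask = mask.copy().ravel()
    new_mask[pruned] = False
    inactive = np.flatnonzero(~new_mask)
    regrown = rng.choice(inactive, size=n_rewire, replace=False)
    new_mask[regrown] = True
    return new_mask.reshape(mask.shape)

# Usage: a sparse 784 -> 300 layer whose topology is rewired once.
mask = erdos_renyi_mask(784, 300)
weights = rng.standard_normal(mask.shape) * mask  # only active weights are nonzero
mask = evolve_mask(weights, mask)
```

In a full training loop, `evolve_mask` would be called between epochs, with newly regrown connections initialized to small random weights before training resumes.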

Original language: English
Article number: 2383
Number of pages: 12
Journal: Nature Communications
Issue number: 1
Early online date: 15 Jul 2017
Publication status: Published - 19 Jun 2018


  • complex networks
  • evolutionary algorithms
  • deep learning
  • sparse artificial neural networks
  • restricted Boltzmann machines
  • sparse training


