Evolving and understanding sparse deep neural networks using cosine similarity

J. Pieterse, Decebal Mocanu

Research output: Contribution to journal › Article › Academic


Abstract

Training sparse neural networks with adaptive connectivity is an active research topic. Such networks require less storage and have lower computational complexity than their dense counterparts. The Sparse Evolutionary Training (SET) procedure uses weight magnitudes to efficiently evolve the topology of a sparse network to fit the dataset, while allowing it to have quadratically fewer parameters than its dense counterpart. Building on this, we propose a novel approach that evolves a sparse network topology based on the behavior of the neurons in the network. More precisely, the cosine similarities between the activations of any two neurons are used to determine which connections are added to or removed from the network. By integrating our approach within the SET procedure, we propose five new algorithms to train sparse neural networks. We argue that our approach has low additional computational complexity, and we draw a parallel to Hebbian learning. Experiments are performed on eight datasets from various domains to demonstrate the general applicability of our approach. Even without optimizing hyperparameters for specific datasets, the experiments show that our proposed training algorithms usually outperform SET and state-of-the-art dense neural network techniques. Last but not least, we show that the evolved connectivity patterns of the input neurons reflect their impact on the classification task.
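The abstract only sketches the mechanism at a high level. As a rough illustration, the NumPy snippet below shows one plausible shape of a cosine-similarity rewiring step for a single sparse layer. The function names (evolve_topology, cosine_similarity_matrix), the rewiring fraction zeta, the small random re-initialization of new weights, and the choice to keep SET's magnitude-based removal while using activation similarity only for regrowth are all assumptions made for illustration; the paper itself proposes five variants, and this sketch is not any one of them verbatim.

    import numpy as np

    def cosine_similarity_matrix(A_prev, A_next):
        """Cosine similarity between every neuron in layer l and every
        neuron in layer l+1, computed over a batch of activations.
        A_prev: (batch, n_in), A_next: (batch, n_out).
        Returns an (n_in, n_out) similarity matrix."""
        num = A_prev.T @ A_next
        norms = np.outer(np.linalg.norm(A_prev, axis=0),
                         np.linalg.norm(A_next, axis=0))
        return num / np.maximum(norms, 1e-12)

    def evolve_topology(W, mask, A_prev, A_next, zeta=0.3):
        """One hypothetical evolution step for a sparse layer.
        W, mask: (n_in, n_out) weights and binary connectivity mask.
        zeta: fraction of existing connections to rewire (assumed)."""
        n_rewire = int(zeta * mask.sum())
        if n_rewire == 0:
            return W, mask
        sim = cosine_similarity_matrix(A_prev, A_next)

        # Remove the n_rewire weakest existing connections (SET-style).
        existing = np.flatnonzero(mask)
        weakest = existing[np.argsort(np.abs(W.flat[existing]))[:n_rewire]]
        mask.flat[weakest] = 0
        W.flat[weakest] = 0.0

        # Grow connections where unconnected neuron pairs co-activate most.
        absent = np.flatnonzero(mask == 0)
        strongest = absent[np.argsort(sim.flat[absent])[-n_rewire:]]
        mask.flat[strongest] = 1
        W.flat[strongest] = np.random.randn(n_rewire) * 0.01

        return W, mask

Growing connections between neuron pairs whose activations point in similar directions is the Hebbian "fire together, wire together" parallel the abstract alludes to; the cosine similarity matrix can be computed from a single forward pass per evolution step, which is why the added computational cost stays low.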
Original language: English
Article number: 1903.07138v1
Number of pages: 14
Journal: arXiv
Publication status: Published - 17 Mar 2019

Fingerprint

Neurons
Neural networks
Computational complexity
Topology
Experiments
Chemical activation
Deep neural networks

Cite this

@article{000f2f79139e40169eee834d9b786edb,
title = "Evolving and understanding sparse deep neural networks using cosine similarity",
abstract = "Training sparse neural networks with adaptive connectivity is an active research topic. Such networks require less storage and have lower computational complexity than their dense counterparts. The Sparse Evolutionary Training (SET) procedure uses weight magnitudes to efficiently evolve the topology of a sparse network to fit the dataset, while allowing it to have quadratically fewer parameters than its dense counterpart. Building on this, we propose a novel approach that evolves a sparse network topology based on the behavior of the neurons in the network. More precisely, the cosine similarities between the activations of any two neurons are used to determine which connections are added to or removed from the network. By integrating our approach within the SET procedure, we propose five new algorithms to train sparse neural networks. We argue that our approach has low additional computational complexity, and we draw a parallel to Hebbian learning. Experiments are performed on eight datasets from various domains to demonstrate the general applicability of our approach. Even without optimizing hyperparameters for specific datasets, the experiments show that our proposed training algorithms usually outperform SET and state-of-the-art dense neural network techniques. Last but not least, we show that the evolved connectivity patterns of the input neurons reflect their impact on the classification task.",
author = "J. Pieterse and Decebal Mocanu",
year = "2019",
month = "3",
day = "17",
language = "English",
journal = "arXiv",
publisher = "Cornell University Library",

}

Evolving and understanding sparse deep neural networks using cosine similarity. / Pieterse, J.; Mocanu, Decebal.

In: arXiv, 17.03.2019.

Research output: Contribution to journal › Article › Academic

TY - JOUR

T1 - Evolving and understanding sparse deep neural networks using cosine similarity

AU - Pieterse, J.

AU - Mocanu, Decebal

PY - 2019/3/17

Y1 - 2019/3/17

N2 - Training sparse neural networks with adaptive connectivity is an active research topic. Such networks require less storage and have lower computational complexity than their dense counterparts. The Sparse Evolutionary Training (SET) procedure uses weight magnitudes to efficiently evolve the topology of a sparse network to fit the dataset, while allowing it to have quadratically fewer parameters than its dense counterpart. Building on this, we propose a novel approach that evolves a sparse network topology based on the behavior of the neurons in the network. More precisely, the cosine similarities between the activations of any two neurons are used to determine which connections are added to or removed from the network. By integrating our approach within the SET procedure, we propose five new algorithms to train sparse neural networks. We argue that our approach has low additional computational complexity, and we draw a parallel to Hebbian learning. Experiments are performed on eight datasets from various domains to demonstrate the general applicability of our approach. Even without optimizing hyperparameters for specific datasets, the experiments show that our proposed training algorithms usually outperform SET and state-of-the-art dense neural network techniques. Last but not least, we show that the evolved connectivity patterns of the input neurons reflect their impact on the classification task.

AB - Training sparse neural networks with adaptive connectivity is an active research topic. Such networks require less storage and have lower computational complexity than their dense counterparts. The Sparse Evolutionary Training (SET) procedure uses weight magnitudes to efficiently evolve the topology of a sparse network to fit the dataset, while allowing it to have quadratically fewer parameters than its dense counterpart. Building on this, we propose a novel approach that evolves a sparse network topology based on the behavior of the neurons in the network. More precisely, the cosine similarities between the activations of any two neurons are used to determine which connections are added to or removed from the network. By integrating our approach within the SET procedure, we propose five new algorithms to train sparse neural networks. We argue that our approach has low additional computational complexity, and we draw a parallel to Hebbian learning. Experiments are performed on eight datasets from various domains to demonstrate the general applicability of our approach. Even without optimizing hyperparameters for specific datasets, the experiments show that our proposed training algorithms usually outperform SET and state-of-the-art dense neural network techniques. Last but not least, we show that the evolved connectivity patterns of the input neurons reflect their impact on the classification task.

UR - https://arxiv.org/abs/1903.07138

M3 - Article

JO - arXiv

JF - arXiv

M1 - 1903.07138v1

ER -