How to exploit sparsity in RNNs on event-driven architectures

Jarno Brils, Luc Waeijen, Arash Pourtaherian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Event-driven architectures have been shown to provide low-power, low-latency artificial neural network (ANN) inference. This is especially beneficial on edge devices, particularly when combined with sparse execution. Recurrent neural networks (RNNs) are ANNs that emulate memory: their recurrent connections enable previous outputs to be reused when generating new outputs. However, when RNNs are used in a sparse context on event-driven architectures, novel challenges arise in synchronization and the use of sparse data. In this work, these challenges are systematically analyzed, and mechanisms to overcome them are proposed. Experimental results of a monocular depth estimation use case on the NeuronFlow architecture show that sparsity in RNNs can be exploited effectively on event-driven architectures.
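To make the idea of sparse, event-driven RNN execution concrete, the following is a minimal sketch of a delta-style recurrent step: only input channels whose value changed by more than a threshold since the last propagated value (the "events") trigger recomputation, while the recurrent term reuses the previous hidden state. This is an illustrative assumption, not the authors' NeuronFlow implementation; all names, shapes, and the thresholding rule are hypothetical.

```python
import numpy as np

def sparse_rnn_step(h_prev, x, x_ref, cached_in, W_in, W_rec, threshold):
    """One event-driven step of a simple tanh RNN.

    Only input channels whose change since the last propagated value
    exceeds `threshold` contribute new computation; the cached input
    preactivation is updated incrementally for those channels only.
    """
    delta = x - x_ref
    events = np.flatnonzero(np.abs(delta) > threshold)  # sparse event list
    # Update the cached input term W_in @ x only for the event channels.
    cached_in = cached_in + W_in[:, events] @ delta[events]
    x_ref = x_ref.copy()
    x_ref[events] = x[events]  # remember which values were propagated
    # Recurrent connection: reuse of the previous output (the RNN's memory).
    h = np.tanh(W_rec @ h_prev + cached_in)
    return h, x_ref, cached_in, events
```

With `threshold = 0` every changed channel fires an event and the step reduces to a dense RNN update; raising the threshold trades a small approximation error for fewer events, which is the sparsity an event-driven architecture can exploit.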

Original language: English
Title of host publication: Proceedings of the 24th International Workshop on Software and Compilers for Embedded Systems, SCOPES 2021
Editors: Sander Stuijk
Publisher: Association for Computing Machinery, Inc.
Pages: 17-22
Number of pages: 6
ISBN (Electronic): 978-1-4503-9166-5
DOIs
Publication status: Published - 1 Nov 2021
Event: 24th International Workshop on Software and Compilers for Embedded Systems, SCOPES 2021 - Virtual, Online, Eindhoven, Netherlands
Duration: 1 Nov 2021 - 2 Nov 2021
Conference number: 24
https://scopesconf.org/scopes-21/

Conference

Conference: 24th International Workshop on Software and Compilers for Embedded Systems, SCOPES 2021
Abbreviated title: SCOPES 2021
Country/Territory: Netherlands
City: Eindhoven
Period: 1/11/21 - 2/11/21

Funding

The authors acknowledge Fundação para a Ciência e a Tecnologia (FCT) for financial support (grant numbers SFRH/BD/44108/2008, SFRH/BD/78639/2011, and SFRH/BPD/91397/2012, and projects PTDC/EQU-ERQ/102771/2008 and UID/QUI/00100/2013).

Keywords

  • asynchronous
  • event-driven
  • monocular depth estimation
  • neural networks
  • recurrent neural networks
  • sparsity

