QMTS: Fixed-point Quantization for Multiple-timescale Spiking Neural Networks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Spiking Neural Networks (SNNs) represent a promising solution for streaming applications at the edge that have strict performance and energy requirements. However, implementing SNNs efficiently at the edge requires model quantization to reduce memory and compute requirements. In this paper, we provide methods to quantize a prominent neuron model for temporally rich problems, the parameterized Adaptive Leaky-Integrate-and-Fire (p-ALIF). p-ALIF neurons combine the computational simplicity of Integrate-and-Fire neurons with accurate learning at multiple timescales, activation sparsity, and increased dynamic range, owing to adaptation and heterogeneity. p-ALIF neurons have shown state-of-the-art (SoTA) performance on temporal tasks such as speech recognition and health monitoring. Our method, QMTS, separates SNN quantization into two stages, allowing one to explore different quantization levels efficiently. QMTS search heuristics are tailored for leaky heterogeneous neurons. We demonstrate QMTS on several temporal benchmarks, showing up to 40x memory reduction and 4x sparser synaptic operations with little accuracy loss, compared to 32-bit float.
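To make the quantization target concrete, the sketch below shows fixed-point integer arithmetic applied to a standard adaptive LIF (ALIF) state update: membrane leak, adaptive threshold, soft reset. This is an illustrative assumption, not the paper's QMTS algorithm; the per-parameter Q-format (here a single shared number of fractional bits), the specific decay values, and the soft-reset choice are all hypothetical.

```python
# Hypothetical sketch: one ALIF neuron step in fixed-point integer arithmetic.
# This is NOT the paper's QMTS method; it only illustrates what "fixed-point
# quantization of an adaptive LIF neuron" can look like.

def to_fixed(x, frac_bits):
    """Round a float to a fixed-point integer with `frac_bits` fractional bits."""
    return int(round(x * (1 << frac_bits)))

def alif_step_fixed(u, a, in_q, alpha_q, rho_q, beta_q, thr_q, frac_bits):
    """One ALIF update; states u (membrane) and a (adaptation) are Q(frac_bits) ints.

    alpha_q, rho_q: quantized membrane / adaptation decay factors.
    beta_q, thr_q:  quantized adaptation strength and baseline threshold.
    in_q:           input current for this step, already in Q format.
    """
    # Leaky integration: multiply by the quantized decay, shift back to Q format.
    u = (u * alpha_q) >> frac_bits
    u += in_q
    # Adaptive threshold: baseline plus beta-scaled adaptation trace.
    thr_t = thr_q + ((beta_q * a) >> frac_bits)
    s = 1 if u >= thr_t else 0
    if s:
        u -= thr_t  # soft reset by the current threshold
    # Adaptation trace: decays and accumulates the neuron's own spikes.
    a = ((a * rho_q) >> frac_bits) + (s << frac_bits)
    return u, a, s
```

With 8 fractional bits, a decay of 0.9 becomes the integer 230, and the whole update runs in integer multiplies and shifts, which is the kind of arithmetic an edge accelerator can implement cheaply.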

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2023
Subtitle of host publication: 32nd International Conference on Artificial Neural Networks, Heraklion, Crete, Greece, September 26–29, 2023, Proceedings, Part I
Editors: Lazaros Iliadis, Antonios Papaleonidas, Plamen Angelov, Chrisina Jayne
Place of Publication: Cham
Publisher: Springer
Pages: 407-419
Number of pages: 13
ISBN (Electronic): 978-3-031-44207-0
ISBN (Print): 978-3-031-44206-3
DOIs
Publication status: Published - 22 Sept 2023
Event: 32nd International Conference on Artificial Neural Networks, ICANN 2023 - Heraklion, Greece
Duration: 26 Sept 2023 – 29 Sept 2023

Publication series

Name: Lecture Notes in Computer Science (LNCS)
Volume: 14254
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 32nd International Conference on Artificial Neural Networks, ICANN 2023
Country/Territory: Greece
City: Heraklion
Period: 26/09/23 – 29/09/23

Funding

Acknowledgement. This work was funded by the Dutch Organization for Scientific Research (NWO) as part of P16-25 eDL project 7.

Funders: Nederlandse Organisatie voor Wetenschappelijk Onderzoek

Keywords

• neuromorphic computing
• quantization
• spiking neural networks
