Navigating Local Minima in Quantized Spiking Neural Networks

Eshraghian, Jason K., Lammie, Corey, Azghadi, Mostafa Rahimi, and Lu, Wei D. (2022) Navigating Local Minima in Quantized Spiking Neural Networks. In: Proceedings of the IEEE International Conference on Artificial Intelligence Circuits and Systems. pp. 352-355. From: AICAS 2022: IEEE International Conference on Artificial Intelligence Circuits and Systems, 13-15 June 2022, Incheon, Republic of Korea.

PDF (Accepted Version) - Restricted to Repository staff only

View at Publisher Website: https://doi.org/10.1109/AICAS54282.2022....
 

Abstract

Spiking and Quantized Neural Networks (NNs) are becoming exceedingly important for hyper-efficient implementations of Deep Learning (DL) algorithms. However, these networks face challenges when trained using error backpropagation, due to the absence of gradient signals when applying hard thresholds. The broadly accepted workaround is the use of biased gradient estimators: surrogate gradients, which approximate thresholding in Spiking Neural Networks (SNNs), and Straight-Through Estimators (STEs), which completely bypass thresholding in Quantized Neural Networks (QNNs). While noisy gradient feedback has enabled reasonable performance on simple supervised learning tasks, it is thought that such noise increases the difficulty of finding optima in loss landscapes, especially during the later stages of optimization. By periodically boosting the Learning Rate (LR) during training, we expect the network to navigate unexplored solution spaces that would otherwise be difficult to reach due to local minima, barriers, or flat surfaces. This paper presents a systematic evaluation of a cosine-annealed LR schedule coupled with weight-independent adaptive moment estimation as applied to Quantized SNNs (QSNNs). We provide a rigorous empirical evaluation of this technique on high-precision and 4-bit quantized SNNs across three datasets, demonstrating state-of-the-art performance on the more complex datasets. Our source code is available at https://github.com/jeshraghian/QSNNs.
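
The scheduling recipe described in the abstract can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration, not the authors' exact configuration (the full quantized-SNN training pipeline is in the linked repository): Adam stands in for weight-independent adaptive moment estimation, a cosine-annealed schedule with warm restarts supplies the periodic LR boosts, and a placeholder dense network with random data replaces the spiking model and datasets. The hyperparameter values (T_0, eta_min, initial LR, epoch count) are illustrative assumptions only.

# Minimal sketch of coupling Adam with a cosine-annealed, periodically
# restarting LR schedule. Placeholder model and data; see the repository
# for the actual QSNN networks, quantization, and surrogate gradients.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# "Weight-independent adaptive moment estimation" realized here as Adam.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))

# Cosine annealing with warm restarts: the LR decays from its initial value
# to eta_min over T_0 epochs, is then reset (boosted), and decays again.
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=10, T_mult=1, eta_min=1e-5
)

loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):
    # Dummy batch; replace with a real (quantized, spiking) data pipeline.
    x = torch.randn(32, 784)
    y = torch.randint(0, 10, (32,))

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

    # Stepping once per epoch; every T_0 epochs the LR is boosted back up.
    scheduler.step()

In the actual experiments the schedule would drive a quantized spiking network trained with surrogate gradients; the sketch only shows how the optimizer and the restarting cosine schedule are coupled.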

Item ID: 76558
Item Type: Conference Item (Research - E1)
ISBN: 9781665409964
Keywords: Deep learning, quantization, scheduling, spiking neural networks
Copyright Information: © 2022 IEEE
Date Deposited: 22 Feb 2023 00:28
FoR Codes: 46 INFORMATION AND COMPUTING SCIENCES > 4611 Machine learning > 461104 Neural networks @ 100%
SEO Codes: 28 EXPANDING KNOWLEDGE > 2801 Expanding knowledge > 280110 Expanding knowledge in engineering @ 100%
