Dataset Information

Revisiting Batch Normalization for Training Low-Latency Deep Spiking Neural Networks From Scratch.


ABSTRACT: Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning owing to sparse, asynchronous and binary event (or spike) driven processing, that can yield huge energy efficiency benefits on neuromorphic hardware. However, SNNs convey temporally-varying spike activation through time that is likely to induce a large variation of forward activation and backward gradients, resulting in unstable training. To address this training issue in SNNs, we revisit Batch Normalization (BN) and propose a temporal Batch Normalization Through Time (BNTT) technique. Different from previous BN techniques with SNNs, we find that varying the BN parameters at every time-step allows the model to learn the time-varying input distribution better. Specifically, our proposed BNTT decouples the parameters in a BNTT layer along the time axis to capture the temporal dynamics of spikes. We demonstrate BNTT on CIFAR-10, CIFAR-100, Tiny-ImageNet, event-driven DVS-CIFAR10 datasets, and Sequential MNIST and show near state-of-the-art performance. We conduct comprehensive analysis on the temporal characteristic of BNTT and showcase interesting benefits toward robustness against random and adversarial noise. Further, by monitoring the learnt parameters of BNTT, we find that we can do temporal early exit. That is, we can reduce the inference latency by ~5 - 20 time-steps from the original training latency. The code has been released at https://github.com/Intelligent-Computing-Lab-Yale/BNTT-Batch-Normalization-Through-Time.
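The core idea described in the abstract — decoupling the Batch Normalization parameters along the time axis so each time-step has its own learnable scale and shift — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' released implementation (which is linked above); the class name `BNTT` and all shapes here are assumptions for illustration.

```python
import numpy as np

class BNTT:
    """Minimal sketch of Batch Normalization Through Time:
    one set of BN parameters (gamma, beta) per time-step, so the
    layer can track the time-varying distribution of spike inputs.
    Illustrative only; training-mode statistics, no running averages."""

    def __init__(self, num_features, num_timesteps, eps=1e-5):
        self.eps = eps
        # Decouple BN parameters along the time axis:
        # row t holds the scale/shift used at time-step t.
        self.gamma = np.ones((num_timesteps, num_features))
        self.beta = np.zeros((num_timesteps, num_features))

    def __call__(self, x, t):
        # x: (batch, features) pre-activations at time-step t.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        # Apply the time-step-specific affine transform.
        return self.gamma[t] * x_hat + self.beta[t]

# Usage: normalize activations at two different time-steps.
bntt = BNTT(num_features=4, num_timesteps=3)
x = np.random.randn(8, 4)
y0 = bntt(x, t=0)   # uses gamma[0], beta[0]
y2 = bntt(x, t=2)   # uses gamma[2], beta[2]
```

Because `gamma[t]` and `beta[t]` are independent per time-step, monitoring their learnt magnitudes is what enables the temporal early-exit behaviour mentioned in the abstract: time-steps whose parameters contribute little can be skipped at inference.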

SUBMITTER: Kim Y 

PROVIDER: S-EPMC8695433 | biostudies-literature |

REPOSITORIES: biostudies-literature

Similar Datasets

| S-EPMC8280105 | biostudies-literature
| S-EPMC5738356 | biostudies-literature
| S-EPMC9945199 | biostudies-literature
| S-EPMC9025538 | biostudies-literature
| S-EPMC7010779 | biostudies-literature
| S-EPMC4820126 | biostudies-literature
| S-EPMC7090229 | biostudies-literature
| S-EPMC8099770 | biostudies-literature
| S-EPMC7511202 | biostudies-literature
| S-EPMC5552800 | biostudies-other