A Tandem Learning Rule for Efficient and Rapid Inference on Deep Spiking Neural Networks

07/02/2019
by Jibin Wu, et al.

Emerging neuromorphic computing (NC) architectures have shown compelling energy efficiency when running machine learning tasks with spiking neural networks (SNNs). However, due to the non-differentiable nature of spiking neuronal functions, the standard error back-propagation algorithm is not directly applicable to SNNs. In this work, we propose a tandem learning framework that consists of an SNN and an Artificial Neural Network (ANN) sharing weights. The ANN is an auxiliary structure that facilitates error back-propagation for training the SNN. To this end, we take the spike count as the discrete neural representation and design an ANN neuronal activation function that effectively approximates the spike count of the coupled SNN. SNNs trained with the proposed tandem learning rule achieve competitive classification accuracies on the CIFAR-10 and ImageNet-2012 datasets, with significantly reduced inference time and total synaptic operations compared with other state-of-the-art SNN implementations. The proposed tandem learning rule thus offers a novel solution for training efficient, low-latency, and high-accuracy deep SNNs with modest computing resources.
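To make the tandem idea concrete, below is a minimal PyTorch-style sketch of one tandem layer. It is illustrative only: the integrate-and-fire dynamics (soft reset by threshold subtraction), the time window `T`, the threshold `v_threshold`, and the clamp-based ANN activation are assumptions made for the sketch, not the exact formulation from the paper. What it shows is the essential structure: a single weight matrix shared by both paths, an SNN path that produces non-differentiable output spike counts, and an ANN path that produces a differentiable approximation of those counts.

```python
import torch
import torch.nn as nn


class TandemLinear(nn.Module):
    """One tandem layer: a spiking (integrate-and-fire) path and an ANN path
    that share a single weight matrix. The ANN path maps input spike counts
    to an approximation of the spiking path's output spike count, giving the
    layer a differentiable surrogate for back-propagation. Hyperparameters
    and neuron model here are illustrative assumptions, not the paper's."""

    def __init__(self, in_features, out_features, T=10, v_threshold=1.0):
        super().__init__()
        self.weight = nn.Linear(in_features, out_features, bias=False)  # shared weights
        self.T = T                      # number of simulation time steps (assumed)
        self.v_threshold = v_threshold  # firing threshold (assumed)

    def snn_forward(self, spike_train):
        """spike_train: (batch, T, in_features) binary spikes.
        Returns the output spike count per neuron; not differentiable."""
        v = spike_train.new_zeros(spike_train.size(0), self.weight.out_features)
        count = torch.zeros_like(v)
        for t in range(self.T):
            v = v + self.weight(spike_train[:, t])    # integrate input current
            spikes = (v >= self.v_threshold).float()  # fire where threshold is crossed
            v = v - spikes * self.v_threshold         # soft reset by subtraction
            count = count + spikes
        return count

    def ann_forward(self, in_count):
        """in_count: (batch, in_features) spike counts from the previous layer.
        Approximates the IF neuron's spike count as the aggregate input
        current divided by the threshold, clamped to the valid range [0, T]."""
        current = self.weight(in_count)
        return torch.clamp(current / self.v_threshold, 0.0, float(self.T))
```

In a hypothetical training loop, one way to couple the two paths is a straight-through-style substitution, e.g. `out_count = snn_count.detach() + ann_count - ann_count.detach()`: the forward pass then carries the exact SNN spike counts to the next layer, while gradients flow through the ANN approximation that shares the same weights.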
