Deep Networks Incorporating Spiking Neural Dynamics
Neural networks have become the key technology of Artificial Intelligence (AI) and have contributed to breakthroughs in several machine learning tasks, primarily owing to advances in Artificial Neural Networks (ANNs). Neural networks incorporating spiking neurons have held great promise because of their brain-inspired temporal dynamics and high power efficiency; however, scaling up and training deep Spiking Neural Networks (SNNs) have remained significant challenges, and SNNs still fall behind ANNs in accuracy on many traditional learning tasks. In this paper, we propose an alternative perspective on the spiking neuron as a particular ANN construct called the Spiking Neural Unit (SNU). Specifically, the SNU casts the stateful temporal spiking neuron, of the leaky integrate-and-fire (LIF) type, as a recurrent ANN unit, analogous to LSTMs and GRUs. Moreover, by introducing the concept of a proportional reset, we generalize the LIF dynamics to a variant called the soft SNU (sSNU). The proposed family of units has a series of benefits. Firstly, ANN training methods, such as backpropagation through time, naturally apply to them and enable a simple approach for successfully training deep spiking networks. For example, a 4- or 7-layer SNN trained on a temporal version of the MNIST dataset achieves higher accuracy than RNN-, LSTM- and GRU-based networks of similar architecture. Secondly, for the task of polyphonic music prediction on the JSB dataset, an sSNU-based network surpasses the state-of-the-art performance of RNN-, LSTM- and GRU-based networks. The novel family of units introduced in this paper bridges biologically-inspired SNNs and ANNs. It provides a systematic methodology for implementing and training deep networks incorporating spiking dynamics that achieve accuracies as high as, or higher than, those of state-of-the-art ANNs. Thus, it opens a new avenue for the widespread adoption of SNNs in practical applications.
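To make the central idea concrete, the following is a minimal sketch of how a LIF-style neuron could be expressed as a recurrent ANN unit with a learnable input weight matrix, a decaying membrane-potential state, and a reset gated by the previous output. The state update, the decay constant `tau`, and the use of a step function (binary spikes) versus a sigmoid (proportional, "soft" reset) are illustrative assumptions for this sketch; the exact SNU and sSNU equations are given in the full paper.

```python
import numpy as np

def snu_step(x, s_prev, y_prev, W, b, tau=0.8, soft=False):
    """One time step of a LIF-style recurrent unit (illustrative sketch).

    x      : input vector at this time step
    s_prev : membrane-potential state from the previous step
    y_prev : output from the previous step; the (1 - y_prev) factor
             resets the state after a spike (proportionally, if soft)
    W, b   : learnable input weights and output bias
    tau    : membrane decay constant (assumed value)
    """
    # Integrate input and leaky, reset-gated state; ReLU keeps the potential non-negative.
    s = np.maximum(0.0, W @ x + tau * s_prev * (1.0 - y_prev))
    if soft:
        # Sigmoid output in (0, 1): a graded spike enabling a proportional reset (sSNU-like).
        y = 1.0 / (1.0 + np.exp(-(s + b)))
    else:
        # Step-function output: a binary spike when the potential crosses threshold (SNU-like).
        y = (s + b > 0.0).astype(x.dtype)
    return s, y
```

Because the unit is written as an ordinary recurrent cell, it can be unrolled over time and trained with backpropagation through time like an RNN, LSTM, or GRU (the binary step function requires a surrogate gradient in practice).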