Static Hand Gesture Recognition for American Sign Language using Neuromorphic Hardware
In this paper, we develop four spiking neural network (SNN) models for two static American Sign Language (ASL) hand gesture classification tasks, i.e., the ASL Alphabet and ASL Digits. The SNN models are deployed on Intel's neuromorphic platform, Loihi, and then compared against equivalent deep neural network (DNN) models deployed on an edge computing device, the Intel Neural Compute Stick 2 (NCS2). We perform a comprehensive comparison between the two systems in terms of accuracy, latency, power consumption, and energy. On the ASL Alphabet dataset, the best DNN model achieves an accuracy of 99.93%, while the best performing SNN model achieves 99.30%. On the ASL Digits dataset, the best DNN model achieves an accuracy of 99.76%, while the best SNN achieves 99.03%. The Loihi neuromorphic hardware implementations achieve up to 20.64x and 4.10x reductions in power consumption and energy, respectively, when compared to the NCS2.
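
To make the setup concrete, the sketch below shows what a small spiking classifier for static hand-gesture images can look like when written with PyTorch and snnTorch. It is only an illustration under assumed choices (fully connected layers, 28x28 grayscale inputs, a 25-step rate-coded simulation, leaky integrate-and-fire neurons with beta = 0.9, and 24 static letter classes); it is not the paper's Loihi model or its actual architecture.

# Illustrative sketch only: a rate-coded SNN classifier for static gesture images.
# Layer sizes, time steps, beta, and class count are assumptions, not the paper's model.
import torch
import torch.nn as nn
import snntorch as snn
from snntorch import surrogate

NUM_STEPS = 25        # assumed number of simulation time steps
NUM_CLASSES = 24      # assumed: static ASL letters (motion-based J and Z excluded)

class GestureSNN(nn.Module):
    def __init__(self, beta: float = 0.9):
        super().__init__()
        grad = surrogate.fast_sigmoid()                    # surrogate gradient for training
        self.fc1 = nn.Linear(28 * 28, 256)                 # assumed 28x28 grayscale input
        self.lif1 = snn.Leaky(beta=beta, spike_grad=grad)
        self.fc2 = nn.Linear(256, NUM_CLASSES)
        self.lif2 = snn.Leaky(beta=beta, spike_grad=grad)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 784) pixel intensities in [0, 1], treated as spike probabilities
        mem1 = self.lif1.init_leaky()
        mem2 = self.lif2.init_leaky()
        spike_count = torch.zeros(x.size(0), NUM_CLASSES)
        for _ in range(NUM_STEPS):
            spk_in = torch.bernoulli(x.clamp(0, 1))        # simple rate encoding of the image
            spk1, mem1 = self.lif1(self.fc1(spk_in), mem1)
            spk2, mem2 = self.lif2(self.fc2(spk1), mem2)
            spike_count += spk2                            # accumulate output spikes over time
        return spike_count                                 # predicted class = argmax of spike count

# Example usage: predictions = GestureSNN()(images.flatten(1)).argmax(dim=1)

A model of this form can be trained with surrogate gradients on GPU and then mapped to neuromorphic hardware, which is the general workflow the paper's Loihi deployment follows; the exact conversion and hardware constraints are described in the full text.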