An Accurate EEGNet-based Motor-Imagery Brain-Computer Interface for Low-Power Edge Computing
This paper presents an accurate and robust embedded motor-imagery brain-computer interface (MI-BCI). The proposed novel model, based on EEGNet, matches the memory-footprint and computational requirements of low-power microcontroller units (MCUs), such as the ARM Cortex-M family. Furthermore, the paper presents a set of methods, including temporal downsampling, channel selection, and narrowing of the classification window, to further scale down the model and relax memory requirements with negligible accuracy degradation. Experimental results on the Physionet EEG Motor Movement/Imagery Dataset show that standard EEGNet achieves 82.43%, 75.07%, and 65.07% classification accuracy on 2-, 3-, and 4-class MI tasks in global validation, outperforming the state-of-the-art (SoA) convolutional neural network (CNN) by 2.05%, 5.25%, and 5.48%. The proposed methods further scale down the standard EEGNet at a negligible accuracy loss of 0.31% with a 7.6x memory-footprint reduction, and at a small accuracy loss of 2.51% with a 15x reduction. The scaled models are deployed on a commercial Cortex-M4F MCU, taking 101 ms and consuming 1.9 mJ per inference for the smallest model, and on a Cortex-M7, taking 44 ms and 6.2 mJ per inference for the medium-sized model, enabling a fully autonomous, wearable, and accurate low-power BCI.
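As a rough illustration of the three scaling methods the abstract names (temporal downsampling, channel selection, and narrowing of the classification window), the sketch below applies them to a synthetic EEG trial shaped like a Physionet recording (64 channels at 160 Hz). The channel subset, downsampling factor, and window length are illustrative placeholders, not the paper's actual configuration.

```python
import numpy as np

# Hypothetical raw MI-EEG trial: 64 channels, 3 s at 160 Hz (Physionet-like layout).
fs = 160
n_channels, n_samples = 64, 3 * fs
trial = np.random.randn(n_channels, n_samples).astype(np.float32)

def scale_down(x, keep_channels, ds_factor, window_s, fs):
    """Apply the three input-reduction steps: channel selection,
    temporal downsampling, and classification-window narrowing."""
    x = x[keep_channels, :]                    # channel selection
    x = x[:, ::ds_factor]                      # temporal downsampling (naive decimation)
    n_keep = int(window_s * fs / ds_factor)    # samples in the narrowed window
    return x[:, :n_keep]                       # narrower classification window

# Illustrative settings (not the paper's exact choices):
reduced = scale_down(trial, keep_channels=np.arange(38), ds_factor=2, window_s=2, fs=fs)
print(trial.shape, '->', reduced.shape)        # (64, 480) -> (38, 160)
```

Shrinking the input tensor this way reduces both the activation memory and the multiply-accumulate count of the first EEGNet layers, which is what makes the model fit within MCU-class memory budgets.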