Brain-based control of car infotainment

04/24/2020
by Andrea Bellotti, et al.

Nowadays, the possibility of running advanced AI on embedded systems allows natural interaction between humans and machines, especially in the automotive field. We present a custom portable EEG-based Brain-Computer Interface (BCI) that exploits Event-Related Potentials (ERPs) induced with an oddball experimental paradigm to control the infotainment menu of a car. A preliminary evaluation of the system was performed on 10 participants, both in a standard laboratory setting and while driving on a closed private track. The task consisted of repeated presentations of 6 different menu icons in oddball fashion. Subject-specific models were trained with different machine learning approaches on cerebral data from the laboratory experiments only, the driving experiments only (in-lab and in-car models), or a combination of the two (hybrid model) to classify EEG responses to target and non-target stimuli. All models were tested on the subjects' last in-car sessions, which were not used for training. Analysis of ERP amplitudes showed statistically significant (p < 0.05) differences between the EEG responses associated with target and non-target icons, both in the laboratory and while driving. Classification Accuracy (CA) was above chance level for all subjects in all training configurations, with a deep CNN trained on the hybrid set achieving the highest scores (mean CA = 53 ± 12%). Analysis of the feature importance provided by a classical BCI approach suggests an ERP-based discrimination between target and non-target responses. No statistical differences were observed between the CAs of the in-lab and in-car training sets, nor between the EEG responses in these two conditions, indicating that data collected in a standard laboratory setting could be readily used for a real driving application without a noticeable decrease in performance.
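As a rough illustration of the evaluation scheme described above, the sketch below trains a binary target/non-target ERP classifier on an in-lab set, an in-car set, and their hybrid combination, and scores each on held-out in-car data. It uses synthetic epochs and a shrinkage-LDA classifier purely as placeholders; the channel count, epoch length, and classifier are illustrative assumptions and not the models or data used in the paper.

    # Hypothetical sketch (not the authors' code): comparing in-lab, in-car, and
    # hybrid training sets for binary ERP classification (target vs. non-target),
    # each evaluated on held-out in-car epochs. Channel count, epoch length, and
    # the shrinkage-LDA classifier are illustrative assumptions.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    def synthetic_epochs(n_epochs, n_channels=8, n_samples=64):
        """Stand-in for preprocessed EEG epochs, returned as flat feature vectors."""
        X = rng.normal(size=(n_epochs, n_channels, n_samples))
        y = rng.integers(0, 2, size=n_epochs)      # 1 = target icon, 0 = non-target
        X[y == 1, :, 25:45] += 0.3                 # crude P300-like deflection on targets
        return X.reshape(n_epochs, -1), y

    # Per-subject data from the two recording conditions plus the held-out test set.
    X_lab, y_lab = synthetic_epochs(300)           # laboratory sessions
    X_car, y_car = synthetic_epochs(300)           # driving sessions (earlier runs)
    X_test, y_test = synthetic_epochs(120)         # last in-car sessions, never trained on

    training_sets = {
        "in-lab": (X_lab, y_lab),
        "in-car": (X_car, y_car),
        "hybrid": (np.vstack([X_lab, X_car]), np.concatenate([y_lab, y_car])),
    }

    for name, (X_tr, y_tr) in training_sets.items():
        clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
        clf.fit(X_tr, y_tr)
        print(f"{name:>7s} model: classification accuracy = {clf.score(X_test, y_test):.2f}")

In the paper the best scores came from a deep CNN trained on the hybrid set; the shrinkage LDA here merely stands in for the "classical BCI approach" mentioned in the abstract.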
