Design and Implementation of EEG-Mechatronic System Interface for Computational Intelligence
Devices that read electroencephalography (EEG) signals are widely used for brain-computer interfaces (BCIs). Interest in BCIs has grown in recent years with the development of several consumer-grade EEG devices that can detect human cognitive states in real time and deliver feedback to enhance human performance. Several studies have been conducted to understand the fundamentals and essential aspects of EEG in BCIs. However, the significant question of how consumer-grade EEG devices can be used to control mechatronic systems effectively has received less attention. In this paper, we design and implement an EEG BCI system using the OpenBCI Cyton headset and a user interface running a game. Participants played the game to gather training data, which was then fed into multiple machine learning models: linear discriminant analysis (LDA), k-nearest neighbours (KNN), and a convolutional neural network (CNN). After the models were trained, a validation phase of the experiment took place in which participants played the same game without direct control, with the outputs of the machine learning models determining how the game moved. We find that a CNN trained on data from the specific user playing the game achieved the highest activation accuracy of the machine learning models tested, supporting future integration with a mechatronic system.
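The model-comparison stage described above can be sketched as follows. This is a minimal, hedged illustration using scikit-learn, not the paper's actual pipeline: the feature shape (trials x features), the synthetic random data, and the binary "game command" labels are all assumptions standing in for the real OpenBCI Cyton recordings; the CNN branch is omitted for brevity.

```python
# Sketch: train LDA and KNN classifiers on EEG-like feature windows and
# compare held-out accuracy, mirroring the abstract's model comparison.
# All data here is synthetic -- shapes and labels are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_features = 200, 64          # e.g. channels x band-power bins (assumed)
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # binary "game command" labels (assumed)
X[y == 1, :8] += 0.8                    # inject weak class-dependent signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
print(scores)
```

In the validation phase, the fitted model's `predict` output on a live feature window would replace the participant's direct input to the game.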