Brain2Object: Printing Your Mind from Brain Signals with Spatial Correlation Embedding
Electroencephalography (EEG) signals are known to manifest differential patterns when individuals visually concentrate on different objects (e.g., a car). In this work, we present an end-to-end digital fabrication system, Brain2Object, that prints the 3D object an individual is observing solely by decoding visually evoked EEG brain signal streams. We propose a unified training framework that combines multi-class Common Spatial Patterns with deep Convolutional Neural Networks to support the backend computation. Specifically, a Dynamical Graph Representation of EEG signals is learned to accurately capture the structured spatial correlations among EEG channels in an adaptive manner. A user-friendly interface is developed as the system front end. Brain2Object presents a streamlined end-to-end workflow that can serve as a template for deeper integration of BCI technologies into routine activities. The proposed system is evaluated extensively through offline experiments and an online demonstrator. For the former, we use both a rich, widely used public dataset and a smaller, locally collected dataset. The experimental results show that our approach consistently outperforms a wide range of baseline and state-of-the-art approaches. The proof-of-concept corroborates the practicality of our approach and illustrates the ease with which such a system could be deployed.
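To make the pipeline concrete, the following is a minimal sketch of one plausible front end for a multi-class CSP + CNN decoder of the kind the abstract describes: one-vs-rest Common Spatial Pattern filters followed by standard log-variance features. All function names and parameters here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: one-vs-rest multi-class CSP feature extraction.
# Assumed shapes and names are illustrative, not the paper's actual code.
import numpy as np
from scipy.linalg import eigh

def covariance(trials):
    """Average trace-normalized spatial covariance over trials.

    trials: array of shape (n_trials, n_channels, n_samples).
    """
    covs = []
    for x in trials:
        c = x @ x.T
        covs.append(c / np.trace(c))
    return np.mean(covs, axis=0)

def csp_one_vs_rest(trials_by_class, n_filters=2):
    """One-vs-rest multi-class CSP filter bank.

    For each class c, solve the generalized eigenproblem
    C_c w = lambda (sum_k C_k) w and keep the eigenvectors with the
    largest and smallest eigenvalues (the most discriminative ones).
    Returns an array of shape (n_classes * 2 * n_filters, n_channels).
    """
    covs = [covariance(t) for t in trials_by_class]
    total = np.sum(covs, axis=0)
    filters = []
    for c in covs:
        w, v = eigh(c, total)  # generalized eigendecomposition, ascending
        picks = list(range(n_filters)) + list(range(-n_filters, 0))
        filters.append(v[:, picks].T)
    return np.vstack(filters)

def csp_features(trial, filters):
    """Normalized log-variance of the spatially filtered trial, a
    standard CSP feature vector that could feed a downstream CNN."""
    z = filters @ trial  # (n_filters_total, n_samples)
    var = z.var(axis=1)
    return np.log(var / var.sum())
```

In a full system, the resulting feature vectors (or the filtered multi-channel signals themselves) would be passed to a convolutional network for classification; the graph-based spatial-correlation learning mentioned in the abstract is a separate, learned component not sketched here.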