Emotion Recognition Through Observer's Physiological Signals

02/19/2020
by Yang Liu et al.

Emotion recognition based on physiological signals is an active research topic with a wide range of applications, such as safe driving, health care, and public security. This paper introduces a physiological dataset, PAFEW, obtained using movie clips from the Acted Facial Expressions in the Wild (AFEW) dataset as stimuli. To establish a baseline, we use the electrodermal activity (EDA) signals in this dataset and extract 6 features from the signal series corresponding to each movie clip to recognize 7 emotions: Anger, Disgust, Fear, Happy, Surprise, Sad, and Neutral. In total, 24 observers participated in the collection of the training set: 19 took part in a single session, watching 80 videos spanning the 7 classes, and 5 took part in multiple sessions and watched all the videos. All videos were presented in an order-balanced fashion. Leave-one-observer-out cross-validation was employed for this classification task. We report the classification accuracy of our baseline, a three-layer network, on this initial training set when training with signals from all observers, from single-session observers only, and from multi-session observers only. We also investigate the recognition accuracy when grouping the dataset by arousal or valence, which reaches 68.66%. Finally, we propose a two-step network. The first step classifies the features into high/low arousal or positive/negative valence with a network. The arousal/valence middle output of the first step is then concatenated with the feature set as input to the second step for emotion recognition. We find that adding arousal or valence information helps to improve the classification accuracy. In particular, positive/negative valence information boosts the classification accuracy to a greater degree on this dataset.
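The abstract does not give implementation details for the two-step network, so the following is only a minimal sketch of that design, assuming a PyTorch implementation; the hidden-layer sizes, activations, and class names are illustrative assumptions, not the authors' exact configuration. It shows step 1 producing a binary arousal/valence prediction whose intermediate ("middle") representation is concatenated with the 6 EDA features as input to step 2 for 7-way emotion recognition.

```python
# Hypothetical sketch of the two-step architecture (not the authors' code).
import torch
import torch.nn as nn

N_FEATURES = 6   # EDA features per movie clip, as stated in the abstract
N_EMOTIONS = 7   # Anger, Disgust, Fear, Happy, Surprise, Sad, Neutral

class StepOne(nn.Module):
    """Step 1: predict high/low arousal (or positive/negative valence)."""
    def __init__(self, hidden=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(N_FEATURES, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, 2)  # binary arousal/valence logits

    def forward(self, x):
        middle = self.body(x)        # "middle output" reused in step 2
        return self.head(middle), middle

class StepTwo(nn.Module):
    """Step 2: emotion recognition from features + step-1 middle output."""
    def __init__(self, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_FEATURES + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, N_EMOTIONS),
        )

    def forward(self, x, middle):
        # Concatenate raw EDA features with the arousal/valence representation.
        return self.net(torch.cat([x, middle], dim=1))

# Usage on a dummy batch of 4 clips.
x = torch.randn(4, N_FEATURES)
step1, step2 = StepOne(), StepTwo()
arousal_logits, middle = step1(x)
emotion_logits = step2(x, middle)    # 7-way emotion prediction
```

In this reading, the design choice is that the auxiliary arousal/valence signal is injected as an extra input feature rather than as a hard label, which matches the abstract's description of concatenating the "middle output" with the feature set.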
