A Multimodal Emotion Sensing Platform for Building Emotion-Aware Applications
Humans use a host of signals to infer the emotional state of others. In general, computer systems that leverage signals from multiple modalities will be more robust and accurate at the same task. We present a multimodal affect and context sensing platform. The system is composed of video, audio, and application analysis pipelines that use ubiquitous sensors (camera and microphone) to log and broadcast emotion data in real time. The platform is designed to enable easy prototyping of novel computer interfaces that sense, respond to, and adapt to human emotion. This paper describes the audio, visual, and application processing components and explains how the data is stored and/or broadcast for other applications to consume. We hope that this platform helps advance the state of the art in affective computing by enabling the development of novel human-computer interfaces.
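As a rough sketch of how a downstream application might consume the platform's real-time broadcast, the Python example below listens for emotion frames and reacts to high-confidence ones. Everything concrete here is an assumption rather than the paper's actual design: the UDP transport, the localhost port, and the JSON schema (the "modality", "emotion", and "confidence" fields) are hypothetical stand-ins for whatever format the platform emits.

```python
import json
import socket

# Hypothetical consumer of the platform's emotion broadcast.
# Assumed (not specified in the abstract): frames arrive as JSON
# datagrams over UDP on localhost port 5005, e.g.
#   {"modality": "video", "emotion": "joy", "confidence": 0.87}
BROADCAST_ADDR = ("127.0.0.1", 5005)
CONFIDENCE_THRESHOLD = 0.5

def listen():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(BROADCAST_ADDR)
    print(f"Listening for emotion frames on {BROADCAST_ADDR} ...")
    while True:
        payload, _ = sock.recvfrom(4096)
        frame = json.loads(payload.decode("utf-8"))
        # An emotion-aware application would adapt its interface here;
        # this sketch just surfaces confident predictions.
        if frame.get("confidence", 0.0) > CONFIDENCE_THRESHOLD:
            print(f'{frame["modality"]}: {frame["emotion"]} '
                  f'({frame["confidence"]:.2f})')

if __name__ == "__main__":
    listen()
```

A decoupled broadcast like this (one sensing process publishing, many applications subscribing) is one plausible way to realize the prototyping goal the abstract describes, since new emotion-aware interfaces can be built without touching the sensing pipelines themselves.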