How Much Does Audio Matter to Recognize Egocentric Object Interactions?

06/03/2019
by Alejandro Cartas, et al.

Sounds are an important source of information on our daily interactions with objects. For instance, a significant number of people can discern the temperature of water being poured just by using their sense of hearing. However, only a few works have explored the use of audio for the classification of object interactions, either in conjunction with vision or as a single modality. In this preliminary work, we propose an audio model for egocentric action recognition and explore its usefulness on the parts of the problem (noun, verb, and action classification). Our model achieves a competitive result in terms of verb classification (34.26% accuracy) on an existing benchmark with respect to vision-based state-of-the-art systems, using a comparatively lighter architecture.
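To make the audio-only setting concrete, below is a minimal sketch of what such a verb classifier could look like: a lightweight 2D CNN over log-mel spectrograms of short audio clips. This is an illustrative example, not the authors' architecture; the class count (125 verbs), sampling rate, clip length, and all layer sizes are assumptions chosen for the sketch.

```python
# Hypothetical sketch: classify the verb of an egocentric object interaction
# from a short mono audio clip, using a small CNN over log-mel spectrograms.
# Not the paper's model; hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torchaudio


class AudioVerbClassifier(nn.Module):
    def __init__(self, n_verbs: int = 125, n_mels: int = 64):
        super().__init__()
        # Raw 16 kHz waveform -> log-mel spectrogram (mel bins x time frames).
        self.melspec = torchaudio.transforms.MelSpectrogram(
            sample_rate=16000, n_fft=1024, hop_length=256, n_mels=n_mels
        )
        self.to_db = torchaudio.transforms.AmplitudeToDB()
        # Comparatively light 2D CNN backbone.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.BatchNorm2d(128), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling copes with variable clip length
        )
        self.classifier = nn.Linear(128, n_verbs)

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        # waveform: (batch, samples), mono audio at 16 kHz
        x = self.to_db(self.melspec(waveform)).unsqueeze(1)  # (batch, 1, mels, frames)
        x = self.features(x).flatten(1)                      # (batch, 128)
        return self.classifier(x)                            # verb logits


if __name__ == "__main__":
    model = AudioVerbClassifier()
    clips = torch.randn(2, 16000 * 4)   # two 4-second dummy clips
    print(model(clips).shape)           # torch.Size([2, 125])
```

A noun or full-action head could be attached to the same pooled features in the same fashion; the key point illustrated here is simply that an audio-only pipeline can stay much lighter than a video backbone.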
