Recognition of Activities from Eye Gaze and Egocentric Video

05/18/2018
by Anjith George, et al.

This paper presents a framework for recognizing human activity from egocentric video and eye tracking data obtained from a head-mounted eye tracker. Three channels of information, namely eye movement, ego-motion, and visual features, are combined for the classification of activities. Image features are extracted using a pre-trained convolutional neural network, while eye movements and ego-motion are quantized and summarized as windowed histograms. The combination of these features yields better activity classification accuracy than any individual feature channel.
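The abstract describes a three-channel pipeline: quantized eye movement, quantized ego-motion, and pre-trained CNN image features, summarized over windows and combined for classification. The sketch below illustrates one possible reading of that pipeline; the number of quantization bins, window size, the use of concatenation for fusion, and the SVM classifier are all assumptions not specified in the abstract, and the CNN features are stood in for by random placeholders.

```python
# A minimal sketch of the feature pipeline described in the abstract.
# Bin count, window size, fusion by concatenation, and the SVM classifier
# are assumptions; the paper may use different choices.

import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def quantize_motion(dx, dy, n_bins=8):
    """Quantize 2-D motion vectors (eye or ego-motion) into angular bins."""
    angles = np.arctan2(dy, dx)                       # values in (-pi, pi]
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int)
    return np.clip(bins, 0, n_bins - 1)


def windowed_histograms(symbols, n_bins, win=30, step=15):
    """Normalized histograms of quantized motion symbols over sliding windows."""
    feats = []
    for start in range(0, len(symbols) - win + 1, step):
        hist = np.bincount(symbols[start:start + win], minlength=n_bins)
        feats.append(hist / hist.sum())
    return np.asarray(feats)


# Toy data standing in for one recording (hypothetical shapes).
T = 300                                               # number of frames
eye_dx, eye_dy = np.random.randn(T), np.random.randn(T)
ego_dx, ego_dy = np.random.randn(T), np.random.randn(T)
cnn_feats = np.random.randn(T, 512)                   # placeholder for per-frame CNN features

n_bins, win, step = 8, 30, 15
eye_hist = windowed_histograms(quantize_motion(eye_dx, eye_dy, n_bins), n_bins, win, step)
ego_hist = windowed_histograms(quantize_motion(ego_dx, ego_dy, n_bins), n_bins, win, step)

# Average CNN features over the same windows so all three channels align.
cnn_win = np.asarray([cnn_feats[s:s + win].mean(axis=0)
                      for s in range(0, T - win + 1, step)])

# Combine the channels by concatenation and classify activities.
X = np.hstack([eye_hist, ego_hist, cnn_win])
y = np.random.randint(0, 4, len(X))                   # dummy activity labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
print("windowed feature matrix:", X.shape)
```

In this reading, each sliding window produces one fused feature vector, so late fusion happens simply by concatenating the per-window histograms with the pooled CNN features before training the classifier.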
