Real-time Pupil Tracking from Monocular Video for Digital Puppetry

06/19/2020
by Artsiom Ablavatski, et al.

We present a simple, real-time approach for pupil tracking from live video on mobile devices. Our method extends a state-of-the-art face mesh detector with two new components: a tiny neural network that predicts the 2D positions of the pupils, and a displacement-based estimation of the pupil blend shape coefficients. Our technique can be used to accurately control a virtual puppet's pupil movements, lending it liveliness and energy. The proposed approach runs at over 50 FPS on modern phones, making it suitable for any real-time puppeteering pipeline.
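The displacement-based estimation mentioned above can be illustrated with a minimal sketch: take the pupil's offset from the eye center, normalize it by the eye's dimensions, and clamp the result into gaze blend shape coefficients. The landmark names, the `gain` constant, and the four coefficient labels here are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def pupil_blendshapes(pupil, eye_inner, eye_outer, eye_top, eye_bottom):
    """Map a 2D pupil position to four gaze blend shape coefficients
    (look_left, look_right, look_up, look_down) from the pupil's
    displacement relative to the eye center.

    All inputs are (x, y) points in image coordinates. This is an
    illustrative sketch, not the method from the paper.
    """
    pupil = np.asarray(pupil, dtype=float)
    center = (np.asarray(eye_inner, float) + np.asarray(eye_outer, float)) / 2.0
    width = np.linalg.norm(np.asarray(eye_outer, float) - np.asarray(eye_inner, float))
    height = np.linalg.norm(np.asarray(eye_top, float) - np.asarray(eye_bottom, float))
    # Normalize the displacement by the eye size so the coefficients are
    # roughly scale-invariant across face sizes and camera distances.
    dx = (pupil[0] - center[0]) / max(width, 1e-6)
    dy = (pupil[1] - center[1]) / max(height, 1e-6)
    gain = 2.0  # hypothetical sensitivity constant; a real system would tune this
    return {
        "look_left":  float(np.clip(-dx * gain, 0.0, 1.0)),
        "look_right": float(np.clip(dx * gain, 0.0, 1.0)),
        "look_up":    float(np.clip(-dy * gain, 0.0, 1.0)),
        "look_down":  float(np.clip(dy * gain, 0.0, 1.0)),
    }
```

A centered pupil yields all-zero coefficients, while an offset toward the outer eye corner drives the corresponding directional coefficient toward 1, which a puppeteering pipeline can feed directly into the character's eye rig.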
