Robust Real-Time Multi-View Eye Tracking

11/15/2017
by Nuri Murat Arar, et al.

Despite significant advances in improving gaze estimation accuracy under controlled conditions, tracking robustness under real-world conditions, such as large head poses and movements, eyeglasses, and illumination and eye-type variations, remains a major challenge in eye tracking. In this paper, we revisit this challenge and introduce a real-time multi-camera eye tracking framework to improve estimation robustness. First, unlike previous work, we design a multi-view tracking setup that acquires multiple eye appearances simultaneously. Leveraging multi-view appearances enables more reliable detection of gaze features under challenging conditions, particularly when they are obstructed in a conventional single-view appearance by large head movements or eyeglasses. The features extracted from the various appearances are then used to estimate multiple gaze outputs. Second, we propose to combine these gaze outputs through an adaptive fusion mechanism in order to compute the user's overall point of regard. The mechanism first determines the estimation reliability of each gaze output according to the user's momentary head pose and general gazing behavior, and then performs a reliability-based weighted fusion. We demonstrate the efficacy of our framework with extensive simulations and user experiments on a collected dataset featuring 20 subjects. Our results show that, in comparison with state-of-the-art eye trackers, the proposed framework provides not only a significant improvement in accuracy but also notable robustness. Our prototype system runs at 30 frames per second (fps) and achieves 1° accuracy under challenging experimental scenarios, which makes it suitable for various applications demanding high precision.
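The reliability-based weighted fusion described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the function name, the per-view reliability scores, and the 2D gaze-point representation are assumptions; the paper derives reliabilities from head pose and gazing behavior, which is abstracted away here.

```python
def fuse_gaze_outputs(gaze_points, reliabilities):
    """Combine per-view gaze estimates into one point of regard.

    gaze_points   -- list of (x, y) gaze estimates, one per camera view
    reliabilities -- nonnegative reliability score for each view
                     (hypothetically derived from head pose / behavior)
    """
    total = sum(reliabilities)
    if total == 0:
        raise ValueError("at least one view must have nonzero reliability")
    # Normalize reliabilities into fusion weights summing to 1.
    weights = [r / total for r in reliabilities]
    # Weighted average of the per-view gaze points.
    x = sum(w * p[0] for w, p in zip(weights, gaze_points))
    y = sum(w * p[1] for w, p in zip(weights, gaze_points))
    return (x, y)
```

For example, with two views estimating (0, 0) and (10, 10) and reliabilities 1 and 3, the fused point of regard is (7.5, 7.5), dominated by the more reliable view.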
