Viewing direction estimation based on 3D eyeball construction for HRI

Abstract

Natural human-robot interaction requires leveraging viewing direction information in order to recognize, respond to, and even emulate human behavior. Knowledge of the eye gaze and point of regard gives us insight into what the subject is interested in and/or who the subject is addressing. In this paper, we present a novel eye gaze estimation approach for point-of-regard (PoG) tracking. To allow for greater head pose freedom, we introduce a new calibration approach to find the 3D eyeball location, eyeball radius, and fovea position. To estimate gaze direction, we map both the iris center and iris contour points to the eyeball sphere (creating a 3D iris disk), giving us the optical axis. We then rotate the fovea accordingly and compute our final, visual-axis gaze direction. Our intention is to integrate this eye gaze approach with a dual-camera system we have developed that detects the face and eyes from a fixed, wide-angle camera and directs another active pan-tilt-zoom camera to focus in on this eye region. The final system will permit natural, non-intrusive, pose-invariant PoG estimation at a distance and allow user translational freedom without resorting to infrared equipment or complex hardware setups. © 2010 IEEE.
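
The abstract describes a geometric pipeline: back-project the detected iris onto a calibrated eyeball sphere, take the optical axis through the eyeball center, then rotate the fovea to recover the visual axis. The sketch below is not the authors' implementation; it is a minimal illustration of that geometry, assuming a pinhole camera with known intrinsics K, an eyeball center, radius, and fovea point already recovered by a calibration step like the one the paper describes. All function and parameter names here are hypothetical.

```python
import numpy as np

def unproject(pixel, K):
    """Back-project a pixel to a unit ray in camera coordinates (pinhole model)."""
    u, v = pixel
    x = (u - K[0, 2]) / K[0, 0]
    y = (v - K[1, 2]) / K[1, 1]
    d = np.array([x, y, 1.0])
    return d / np.linalg.norm(d)

def optical_axis(iris_px, eyeball_center, eyeball_radius, K):
    """Intersect the iris-center ray with the eyeball sphere; return the 3D iris
    center and the optical axis (unit vector from eyeball center through it)."""
    d = unproject(iris_px, K)
    c = np.asarray(eyeball_center, dtype=float)
    # Ray p(t) = t * d, sphere |p - c|^2 = r^2  =>  t^2 - 2 t (d.c) + |c|^2 - r^2 = 0
    b = -2.0 * d.dot(c)
    c0 = c.dot(c) - eyeball_radius ** 2
    disc = b * b - 4.0 * c0
    if disc < 0:
        raise ValueError("iris ray misses the eyeball sphere")
    t = (-b - np.sqrt(disc)) / 2.0          # nearer root = front surface of the eye
    iris_3d = t * d
    axis = iris_3d - c
    return iris_3d, axis / np.linalg.norm(axis)

def rotation_between(a, b):
    """Rodrigues rotation taking unit vector a onto unit vector b."""
    v = np.cross(a, b)
    s, c = np.linalg.norm(v), a.dot(b)
    if s < 1e-9:
        return np.eye(3)                    # (anti)parallel axes: identity suffices for this sketch
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx * ((1 - c) / s ** 2)

def visual_axis(optical_now, optical_calib, fovea_calib, eyeball_center, iris_3d):
    """Rotate the calibrated fovea point with the eyeball, then take the ray from
    the rotated fovea through the 3D iris center as the visual-axis gaze direction."""
    R = rotation_between(np.asarray(optical_calib), np.asarray(optical_now))
    fovea_now = eyeball_center + R @ (np.asarray(fovea_calib) - eyeball_center)
    g = iris_3d - fovea_now
    return g / np.linalg.norm(g)
```

Under these assumptions, the point of regard would then be obtained by intersecting the resulting visual-axis ray with the scene or screen plane of interest; the paper's full method additionally uses the iris contour points (the 3D iris disk) rather than the iris center alone.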

Citation (APA)

Reale, M., Hung, T., & Yin, L. (2010). Viewing direction estimation based on 3D eyeball construction for HRI. In 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops, CVPRW 2010 (pp. 24–31). https://doi.org/10.1109/CVPRW.2010.5543784
