In recent years, intuitive user interfaces such as touch panels and pen displays have become widely used in PCs and PDAs. The authors previously developed a bright pupil camera, and subsequently developed an eye-tracking pen display based on this camera and a new aspherical model of the eye. In this paper, a robust gaze estimation method that uses an integrated-light-source camera is proposed for analyzing embodied interaction, and a prototype of the eye-tracking pen display is developed. The accuracy of the system is approximately 12 mm on a 15″ pen display, which is sufficient for human interaction support. © 2011 Springer-Verlag.
CITATION
Yamamoto, M., Sato, H., Yoshida, K., Nagamatsu, T., & Watanabe, T. (2011). Development of an eye-tracking pen display for analyzing embodied interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6771 LNCS, pp. 651–658). https://doi.org/10.1007/978-3-642-21793-7_74