Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

Abstract

Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. <12×12 pixels) point-and-select tasks. We conducted two experiments comparing the performance of dwell, magnification and zoom methods in point-and-select tasks with small targets in single- and multiple-target layouts. Both magnification and zoom showed higher hit rates than dwell. Hit rates were higher when using magnification than when using zoom, but total pointing times were shorter using zoom. Furthermore, participants perceived magnification as more fatiguing than zoom. The higher accuracy of magnification makes it preferable when interacting with small targets. Our findings may guide the development of interface tools to facilitate access to mainstream interfaces for people with motor disabilities and other users in need of hands-free interaction. © 2011 Taylor & Francis.
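For readers unfamiliar with the dwell method that serves as the baseline in this comparison, the Python sketch below illustrates the general idea: a target is selected once the gaze remains within a small tolerance of it for a fixed time. This is an illustrative sketch only, not the authors' implementation; the gaze_samples format, the 500 ms dwell threshold and the 6-pixel tolerance are assumptions, not values from the study.

    # Minimal sketch of dwell-based gaze selection (illustrative assumptions).
    DWELL_TIME_S = 0.5   # assumed dwell threshold in seconds
    TOLERANCE_PX = 6     # assumed half-width of the acceptance window in pixels

    def dwell_select(gaze_samples, target_center):
        """Return True once gaze has dwelt on target_center long enough.

        gaze_samples yields (timestamp_s, x, y) tuples from an eye tracker;
        target_center is the (x, y) centre of the on-screen target.
        """
        dwell_start = None
        tx, ty = target_center
        for t, x, y in gaze_samples:
            on_target = abs(x - tx) <= TOLERANCE_PX and abs(y - ty) <= TOLERANCE_PX
            if on_target:
                if dwell_start is None:
                    dwell_start = t                  # dwell begins
                elif t - dwell_start >= DWELL_TIME_S:
                    return True                      # dwell threshold reached: select
            else:
                dwell_start = None                   # gaze left the target: reset
        return False

Magnification and zoom, by contrast, enlarge the region around the gaze point before the final selection, which is why they trade pointing time against hit rate in the experiments described above.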

Citation (APA)

Skovsgaard, H., Mateo, J. C., & Hansen, J. P. (2011). Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets. Behaviour and Information Technology, 30(6), 821–831. https://doi.org/10.1080/0144929X.2011.563801
