Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing

5 citations · 11 Mendeley readers

Abstract

We investigate silent speech as a hands-free selection method for eye-gaze pointing. We first propose a stripped-down image-based model that can recognize a small number of silent commands almost as fast as state-of-the-art speech recognition models. We then compare it with other hands-free selection methods (dwell, speech) in a Fitts' law study. Results revealed that speech and silent speech are comparable in throughput and selection time, but silent speech is significantly more accurate than the other methods. A follow-up study revealed that target selection around the center of the display is significantly faster and more accurate, while selection around the top corners and the bottom edge is slower and more error prone. We then present a method for selecting menu items with eye-gaze and silent speech. A study revealed that it significantly reduces task completion time and error rate.
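The throughput metric reported in the Fitts' law study is conventionally computed from the index of difficulty (Shannon formulation) divided by movement time. A minimal sketch of that calculation, with illustrative numbers that are not taken from the paper:

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1)."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: index of difficulty over movement time (s)."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative example (hypothetical values, not from the study):
# a 20 px wide target, 400 px away, selected in 1.2 s.
tp = throughput(400, 20, 1.2)
```

In practice, throughput studies average this quantity over participants and distance/width conditions; this sketch shows only the per-trial computation.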

Citation (APA)

Pandey, L., & Arif, A. S. (2022). Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing. Proceedings of the ACM on Human-Computer Interaction, 6(ISS), 328–353. https://doi.org/10.1145/3567723
