Menu selection using auditory interface


Abstract

An approach to auditory interaction with a wearable computer is investigated. Menu selection and keyboard input interfaces are experimentally implemented by integrating a pointing interface based on motion sensors with an auditory localization system based on head-related transfer functions (HRTFs). User performance, i.e., the efficiency of interaction, was evaluated in experiments with human subjects. The average time for selecting a menu item was approximately 5–9 seconds depending on the geometric configuration of the menu, and average key input performance was approximately 6 seconds per character. The results did not support our expectation that auditory localization of menu items would be a helpful cue for accurate pointing. © Springer-Verlag Berlin Heidelberg 2007.
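The abstract does not give implementation details, but the core idea, menu items placed at spatial positions and selected by pointing direction, can be illustrated with a minimal sketch. All function names, the frontal-arc layout, and the tolerance parameter below are hypothetical assumptions, not taken from the paper:

```python
def menu_azimuths(n_items, span_deg=180.0):
    """Evenly space menu items across a frontal arc, centered at 0 degrees
    (straight ahead). Hypothetical layout; the paper tested several
    geometric configurations."""
    if n_items == 1:
        return [0.0]
    step = span_deg / (n_items - 1)
    return [-span_deg / 2 + i * step for i in range(n_items)]

def select_item(pointing_deg, azimuths, tolerance_deg=15.0):
    """Return the index of the menu item nearest the pointing direction
    (e.g., head or hand azimuth from a motion sensor), or None if no
    item lies within the angular tolerance."""
    best = min(range(len(azimuths)),
               key=lambda i: abs(azimuths[i] - pointing_deg))
    if abs(azimuths[best] - pointing_deg) <= tolerance_deg:
        return best
    return None
```

In a full system, each azimuth would also drive an HRTF renderer so the item's audio cue appears to come from that direction; this sketch covers only the geometric selection step.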

Citation (APA)

Hirota, K., Watanabe, Y., & Ikei, Y. (2007). Menu selection using auditory interface. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4552 LNCS, pp. 70–75). Springer Verlag. https://doi.org/10.1007/978-3-540-73110-8_8
