Interfaces that allow users to interact with a computing system using free-hand mid-air gestures are becoming increasingly prevalent. A typical shortcoming of such gesture-based interfaces, however, is their lack of a haptic component. One technology with the potential to address this issue is ultrasound mid-air haptic feedback. At present, haptic sensations are typically designed by system engineers and experts. In the case of gestural interfaces, researchers have started involving non-expert users to define suitable gestures for specific interactions. To our knowledge, no studies have involved laypeople, from a similar participatory design perspective, in generating mid-air haptic sensations. We present the results of an end-user elicitation study yielding a user-defined set of mid-air haptic sensations matched to gestures used for interacting with an Augmented Reality menu environment. In addition, we discuss the suitability of the end-user elicitation method to that end.
Van den Bogaert, L., & Geerts, D. (2020). User-defined mid-air haptic sensations for interacting with an AR menu environment. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12272 LNCS, pp. 25–32). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58147-3_3