Leveraging Smartwatch and Earbuds Gesture Capture to Support Wearable Interaction

Abstract

Due to the proliferation of smart wearables, designers can now explore novel ways for end-users to use devices in combination. In this paper, we explore the gestural input enabled by smart earbuds coupled with a proximal smartwatch. Through an elicitation study, we identify a consensus set of gestures and a taxonomy of the types of gestures participants create. In a follow-on study conducted on Amazon's Mechanical Turk, we explore the social acceptability of gestures enabled by watch+earbud gesture capture. While elicited gestures remain simple, discrete, in-context actions, we find that elicited input is frequently abstract, varies in size and duration, and is split almost equally between on-body, proximal, and more distant actions. Together, our results provide guidelines for on-body, near-ear, and in-air input using earbuds and a smartwatch to support gesture capture.

Citation (APA)

Rateau, H., Lank, E., & Liu, Z. (2022). Leveraging Smartwatch and Earbuds Gesture Capture to Support Wearable Interaction. Proceedings of the ACM on Human-Computer Interaction, 6(ISS). https://doi.org/10.1145/3567710
