SmartGrip: grip sensing system for commodity mobile devices through sound signals


Abstract

Although many studies have attempted to detect the hand postures used to hold a mobile device and utilize these postures as a user interface, they either require additional hardware or can differentiate only a limited number of grips, and only when there is a touch event on the mobile device’s screen. In this paper, we propose a novel grip sensing system, called SmartGrip, which allows a mobile device to detect different hand postures without any additional hardware or a screen touch event. SmartGrip emits carefully designed sound signals and differentiates the propagated signals distorted by different user grips. To achieve this, we analyze how a sound signal propagates from the speaker to the microphone of a mobile device and then address three key challenges: sound structure design, volume control, and feature extraction and classification. We implement and evaluate SmartGrip on three Android mobile devices. With six representative grips, SmartGrip exhibits 93.1% average accuracy for ten users in an office environment. We also demonstrate that SmartGrip operates with 83.5 to 98.3% accuracy in six different (noisy) locations. Further demonstrating the feasibility of SmartGrip as a user interface, we develop an Android application that exploits SmartGrip, validating its practical usage.
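The abstract outlines an emit-record-extract-classify pipeline, but does not specify the actual probe signal, features, or classifier used in SmartGrip. The following is a minimal illustrative sketch of that general pipeline only, assuming a hypothetical near-inaudible probe tone, simulated grip-dependent attenuation in place of real microphone recordings, magnitude-spectrum features, and an off-the-shelf SVM; none of these choices are taken from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Hypothetical parameters -- the real SmartGrip signal design is not given
# in this abstract; this only illustrates emit -> record -> features -> classify.
FS = 48_000          # sample rate (Hz)
TONE_HZ = 18_000     # assumed near-inaudible probe tone
DURATION = 0.05      # 50 ms probe

def probe_signal():
    t = np.arange(int(FS * DURATION)) / FS
    return np.sin(2 * np.pi * TONE_HZ * t)

def simulate_received(grip_id, rng):
    """Stand-in for a microphone recording: each 'grip' applies a different
    attenuation and adds noise, mimicking how a hand distorts the
    speaker-to-microphone path."""
    gains = [1.0, 0.8, 0.6, 0.45, 0.3, 0.2]   # one gain per simulated grip
    sig = gains[grip_id] * probe_signal()
    return sig + 0.05 * rng.standard_normal(sig.size)

def spectral_features(sig, n_bins=64):
    """Magnitude spectrum around the probe tone as a simple feature vector."""
    spec = np.abs(np.fft.rfft(sig))
    center = int(TONE_HZ * sig.size / FS)
    return spec[center - n_bins // 2 : center + n_bins // 2]

rng = np.random.default_rng(0)
X, y = [], []
for grip in range(6):                 # six representative grips, as in the paper
    for _ in range(100):              # repeated probe emissions per grip
        X.append(spectral_features(simulate_received(grip, rng)))
        y.append(grip)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"toy grip-classification accuracy: {clf.score(X_test, y_test):.2%}")
```

On a real device the simulated recording would be replaced by actual speaker playback and microphone capture, and the features and classifier would follow the paper's own design.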

Citation (APA)

Kim, N., Lee, J., Whang, J. J., & Lee, J. (2020). SmartGrip: grip sensing system for commodity mobile devices through sound signals. Personal and Ubiquitous Computing, 24(5), 643–654. https://doi.org/10.1007/s00779-019-01337-7
