Detecting Verbal and Non-Verbal Gestures Using Earables

Abstract

Verbal and non-verbal activities convey insightful information about people's affect, empathy, and engagement during social interactions. In this paper, we investigate the use of inertial sensors to recognize verbal (e.g., speaking), non-verbal (e.g., head nodding, head shaking), and other activities (e.g., eating, no movement). We implement an end-to-end deep neural network to distinguish among these activities. We then explore the generalizability of the approach in three scenarios: (1) using new data to detect a known activity from a known user, (2) detecting a novel activity of a known user, and (3) detecting the activity of an unknown user. Results show that, using accelerometer and gyroscope data, the model achieves a balanced accuracy of 55% when tested on data from a new user, 41% on a new activity of an existing user, and 80% on new data of a known activity from an existing user. These results are 7 to 47 percentage points higher than those of baseline classifiers.
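
The abstract describes the model only as an "end-to-end deep neural network" over accelerometer and gyroscope signals. As a rough illustration of what such a classifier could look like, here is a minimal sketch assuming a small 1D-CNN, a 50 Hz sampling rate, 2-second windows, and the five activity labels named in the abstract; the class IMUActivityNet, the layer sizes, and the windowing parameters are all illustrative assumptions, not the authors' architecture.

# Hypothetical sketch, not the paper's model: a small 1D-CNN over
# 6-axis IMU windows (3-axis accelerometer + 3-axis gyroscope).
import torch
import torch.nn as nn

# Activity labels taken from the abstract's examples.
CLASSES = ["speaking", "head_nodding", "head_shaking", "eating", "no_movement"]

class IMUActivityNet(nn.Module):
    def __init__(self, n_channels=6, n_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over time -> fixed-size vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        # x: (batch, channels, window_length) raw sensor window
        z = self.features(x).squeeze(-1)  # (batch, 64)
        return self.classifier(z)         # (batch, n_classes) logits

# Example: a batch of 2-second windows at an assumed 50 Hz (100 samples).
model = IMUActivityNet()
windows = torch.randn(8, 6, 100)
logits = model(windows)  # shape (8, 5)

Note that the three generalization scenarios map onto different train/test splits: scenario (3), for instance, would correspond to a user-independent split in which all windows from the held-out participant are excluded from training.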

Citation (APA)

Laporte, M., Baglat, P., Gashi, S., Gjoreski, M., Santini, S., & Langheinrich, M. (2021). Detecting Verbal and Non-Verbal Gestures Using Earables. In UbiComp/ISWC 2021 - Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers (pp. 165–170). Association for Computing Machinery, Inc. https://doi.org/10.1145/3460418.3479322
