Smartwatch user interface implementation using CNN-based gesture pattern recognition


Abstract

In recent years, as smartwatch use has increased among wearable devices, various applications have been developed for the device. However, the realization of a user interface is limited by the smartwatch's small size and volume. This study proposes a method to classify the user's gestures without any additional input device, thereby improving the user interface. The smartwatch's built-in accelerometer collects motion data, and a machine learning algorithm learns and classifies the gesture patterns. By incorporating a convolutional neural network (CNN) model, the proposed pattern recognition system achieves higher accuracy than the existing model. Performance analysis shows that the proposed system classifies 10 gesture patterns with an accuracy of 97.3%.
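The pipeline described above (accelerometer windows fed to a CNN that outputs one of 10 gesture classes) can be sketched in plain NumPy. This is a minimal illustration, not the authors' implementation: the window length (50 samples), kernel size, filter count, and randomly initialised weights standing in for trained parameters are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_relu(x, w, b):
    # x: (T, C_in) accelerometer window, w: (K, C_in, C_out), b: (C_out,)
    T, _ = x.shape
    K, _, C_out = w.shape
    out = np.zeros((T - K + 1, C_out))
    for t in range(T - K + 1):
        # Correlate the kernel with the slice starting at t, over time and channels
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)  # ReLU activation

def softmax(z):
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical input: 50 accelerometer samples with 3 axes (x, y, z)
window = rng.standard_normal((50, 3))

# Randomly initialised weights stand in for a trained model
w1 = rng.standard_normal((5, 3, 8)) * 0.1   # kernel size 5, 8 filters
b1 = np.zeros(8)
w2 = rng.standard_normal((8, 10)) * 0.1     # dense layer to 10 gesture classes
b2 = np.zeros(10)

features = conv1d_relu(window, w1, b1).mean(axis=0)  # global average pooling
probs = softmax(features @ w2 + b2)                  # class probabilities
pred = int(np.argmax(probs))                         # predicted gesture index
```

In a real system the weights would be learned from labelled gesture recordings; the sketch only shows the forward pass that maps a sensor window to a class prediction.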

Citation (APA)

Kwon, M. C., Park, G., & Choi, S. (2018). Smartwatch user interface implementation using CNN-based gesture pattern recognition. Sensors (Switzerland), 18(9). https://doi.org/10.3390/s18092997
