Non-Invasive Tongue-Based HCI System Using Deep Learning for Microgesture Detection

Abstract

Tongue-based Human-Computer Interaction (HCI) systems have emerged as alternative input devices offering significant benefits to individuals with severe disabilities. However, existing systems often rely on invasive components such as dental retainers, tongue piercings, and multiple in-mouth electrodes, which hygiene concerns and obtrusiveness make impractical for daily use. This paper presents a novel non-invasive tongue-based HCI system that uses deep learning for microgesture detection. The proposed system overcomes the limitations of previous methods by detecting gestures non-invasively: tongue vibrations are measured with an accelerometer positioned over the genioglossus muscle, eliminating the need for any in-mouth installation. System performance was evaluated by comparing the deep learning classifier against four widely used supervised machine learning algorithms: K-Nearest Neighbors (KNN), Support Vector Machines (SVM), Decision Trees, and Random Forests. For these classical algorithms, the raw data were preprocessed in both the time and frequency domains to extract relevant patterns before classification. In addition, a Convolutional Neural Network (CNN) was trained directly on the raw data, exploiting its ability to process time series and to capture intricate patterns automatically through its convolutional and pooling layers. The CNN model achieved 97% accuracy in tongue-gesture detection. The proposed system is also low-profile, lightweight, and cost-effective, making it suitable for daily use in a variety of contexts. This study thus introduces a non-invasive, efficient, and practical approach to tongue-based HCI systems.
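
The classical pipeline described in the abstract (windowed accelerometer data, hand-crafted time- and frequency-domain features, then the four supervised classifiers) can be sketched as follows. This minimal Python/scikit-learn illustration assumes a 100 Hz three-axis accelerometer, 128-sample windows, four microgesture classes, and a plausible feature set; these specifics are illustrative assumptions, not the paper's published configuration, and the synthetic data merely stands in for recorded tongue vibrations.

import numpy as np
from scipy import stats
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

def extract_features(window, fs=100):
    """Time- and frequency-domain features for one accelerometer window.
    `window` has shape (n_samples, 3), one column per axis; `fs` is the
    assumed sampling rate in Hz."""
    feats = []
    for axis in window.T:
        # Time domain: basic statistics of the vibration signal.
        feats += [axis.mean(), axis.std(), stats.skew(axis),
                  stats.kurtosis(axis), np.sqrt(np.mean(axis ** 2))]  # RMS
        # Frequency domain: dominant frequency and total spectral energy.
        spectrum = np.abs(np.fft.rfft(axis))
        freqs = np.fft.rfftfreq(axis.size, d=1.0 / fs)
        feats += [freqs[spectrum.argmax()], spectrum.sum()]
    return np.array(feats)

# Synthetic stand-in data: 200 windows of 128 samples x 3 axes, plus
# labels for four hypothetical tongue microgestures.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 128, 3))
labels = rng.integers(0, 4, size=200)

X = np.array([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

for clf in (KNeighborsClassifier(), SVC(), DecisionTreeClassifier(),
            RandomForestClassifier()):
    score = clf.fit(X_train, y_train).score(X_test, y_test)
    print(f"{type(clf).__name__}: test accuracy {score:.2f}")

On random data these scores hover near chance; the point is the shape of the pipeline, with feature extraction feeding interchangeable classifiers, rather than the numbers.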
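
The CNN branch, by contrast, operates on the raw windows directly, letting convolutional and pooling layers learn discriminative vibration patterns instead of hand-crafted features. The Keras sketch below shows the idea; the layer sizes, window shape, and training settings are again assumptions for illustration, not the architecture reported in the paper.

import numpy as np
import tensorflow as tf

NUM_CLASSES = 4        # assumed number of tongue microgestures
WINDOW, AXES = 128, 3  # assumed window length and accelerometer axes

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, AXES)),
    # 1D convolutions slide along the time axis, so each filter learns a
    # short vibration motif wherever it occurs in the window.
    tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic stand-in for the raw accelerometer recordings.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, WINDOW, AXES)).astype("float32")
y = rng.integers(0, NUM_CLASSES, size=200)
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

Max pooling downsamples the time axis after each convolution, so successive layers summarize the vibration signal at progressively longer time scales without any manual feature design.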

Citation (APA)

Jasim, D. F., & Shareef, W. F. (2023). Non-Invasive Tongue-Based HCI System Using Deep Learning for Microgesture Detection. Revue d’Intelligence Artificielle, 37(4), 985–995. https://doi.org/10.18280/ria.370420
