Analysis of human-machine interaction through facial expression and hand-gesture recognition


Abstract

This paper presents a review of recent work on facial expression and hand gesture recognition. Facial expressions and hand gestures are used to express emotions without oral communication. The human brain can identify a person's emotions from expressions or hand gestures within a fraction of a second, and research on human-machine interaction (HMI) expects systems built on such HMI algorithms to respond similarly. Furthermore, when a person expresses emotions orally, he or she automatically uses complementary facial expressions and hand gestures. Some existing systems are designed to convey these emotions through HMI without oral communication, while others accept various combinations of hand gestures and facial expressions as videos or images. In these systems, the meaning or emotion conveyed by a particular hand gesture or expression is predefined, and the systems are trained and tested accordingly. Certain other systems define the meanings of hand gestures and facial expressions separately.
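To illustrate the "predefined label, then train and test" setup the abstract describes, the sketch below shows a minimal classification pipeline. It is not taken from the paper: the emotion-label mapping, the feature dimensionality, and the synthetic feature vectors are all assumptions standing in for features extracted from expression or gesture images.

```python
# Illustrative sketch (not from the paper): each gesture/expression class is
# mapped to a predefined emotion label, and a classifier is trained and then
# tested on labelled feature vectors. Feature extraction from images or video
# frames is assumed to have already produced fixed-length vectors.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Hypothetical predefined mapping from class index to emotion.
EMOTION_LABELS = {0: "happy", 1: "angry", 2: "surprised"}

rng = np.random.default_rng(0)
# Stand-in features: 300 samples, each a 64-dimensional descriptor.
X = rng.normal(size=(300, 64))
y = rng.integers(0, len(EMOTION_LABELS), size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = SVC(kernel="rbf")        # any off-the-shelf classifier would do here
clf.fit(X_train, y_train)      # "trained" on the predefined labels
pred = clf.predict(X_test)     # "tested" on held-out samples

print("accuracy:", accuracy_score(y_test, pred))
print("example prediction:", EMOTION_LABELS[int(pred[0])])
```

With real image or video features in place of the random vectors, the same structure (fixed label set, train split, test split) matches the workflow the reviewed systems are described as using.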

Citation (APA)

Rokade, R., Kshirsagar, K., Sonawane, J., & Munde, S. (2019). Analysis of human-machine interaction through facial expression and hand-gesture recognition. International Journal of Innovative Technology and Exploring Engineering, 8(9 Special Issue 3), 1320–1328. https://doi.org/10.35940/ijitee.i3287.0789s319
