Sensor and Feature Level Fusion of Thermal Image and ECG Signals in Recognizing Human Emotions

Abstract

Recent studies on the recognition of emotion labels have concentrated on speech signals, text, visual images and physiological variables. The proposed system combines features of the ECG, extracted using empirical mode decomposition (EMD), with features of thermal images extracted from the Gray Level Co-occurrence Matrix (GLCM), viz. energy, contrast, homogeneity and correlation. The ECG is acquired with an AD8232 module and the thermal images with a FLUKE TiS20 camera. ECG data and thermal images are acquired simultaneously from each subject; the database consists of data from 40 subjects aged 20 to 40 years from Hassan, Karnataka, India. The different emotion labels are classified using the K-nearest neighbor (KNN) decision rule. The system yielded the highest accuracy for disgust and the lowest for anger using ECG features, and the highest accuracy for disgust and surprise and the lowest for sad using thermal image features.
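The thermal-image side of the pipeline described above can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the horizontal-neighbour GLCM, the choice of k = 3, and the `knn_predict` helper are assumptions made here for clarity, and the EMD step for the ECG features is not shown.

```python
import numpy as np

def glcm(image, levels):
    """Normalized symmetric co-occurrence matrix for horizontal neighbours
    (illustrative choice; other distances/angles are possible)."""
    m = np.zeros((levels, levels), dtype=float)
    # Count co-occurrences of each (left pixel, right pixel) grey-level pair.
    np.add.at(m, (image[:, :-1].ravel(), image[:, 1:].ravel()), 1.0)
    m += m.T                                  # make the matrix symmetric
    return m / m.sum()

def glcm_features(p):
    """The four texture features named in the abstract, from a normalized GLCM p."""
    i, j = np.indices(p.shape)
    energy = np.sum(p ** 2)
    contrast = np.sum((i - j) ** 2 * p)
    homogeneity = np.sum(p / (1.0 + (i - j) ** 2))
    mu_i, mu_j = np.sum(i * p), np.sum(j * p)
    sd_i = np.sqrt(np.sum((i - mu_i) ** 2 * p))
    sd_j = np.sqrt(np.sum((j - mu_j) ** 2 * p))
    correlation = np.sum((i - mu_i) * (j - mu_j) * p) / (sd_i * sd_j)
    return np.array([energy, contrast, homogeneity, correlation])

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote among the k nearest training feature vectors."""
    order = np.argsort(np.linalg.norm(train_X - x, axis=1))
    labels, counts = np.unique(train_y[order[:k]], return_counts=True)
    return labels[np.argmax(counts)]
```

Feature-level fusion then amounts to concatenating the EMD-derived ECG features with the four-element GLCM vector into a single vector per subject before applying the KNN rule.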

Citation (APA)
Sensor and Feature Level Fusion of Thermal Image and ECG Signals in Recognizing Human Emotions. (2019). International Journal of Innovative Technology and Exploring Engineering, 9(2S), 78–82. https://doi.org/10.35940/ijitee.b1054.1292s19
