Emotion recognition involving physiological and speech signals: A comprehensive review

Abstract

Emotions play an important role in decision making, planning, reasoning, and other human mental processes. Recognizing a driver’s emotions is becoming a vital task for advanced driver assistance systems (ADAS): monitoring the driver’s emotional state provides feedback that can help prevent accidents, since driving in an aggressive mood contributes to traffic accidents. Emotion recognition can be achieved by analyzing facial expressions, speech, and various biosignals such as the electroencephalogram (EEG), blood volume pulse (BVP), electrodermal activity (EDA), and electrocardiogram (ECG). This chapter presents a comprehensive review of state-of-the-art methodologies for emotion recognition based on physiological changes and speech. In particular, we investigate the potential of physiological signals and the driver’s speech for emotion recognition and their requirements for ADAS. All steps of an automatic recognition system are explained: emotion elicitation, data preprocessing (e.g., noise and artifact removal), feature extraction and selection, and finally classification.
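
A minimal sketch of such a recognition pipeline is shown below for illustration only; it is not the implementation reviewed in the chapter. It assumes a single-channel physiological signal (e.g., BVP) sampled at 64 Hz, simple statistical features per window, and an SVM classifier; the synthetic signal, labels, window length, and filter band are placeholder assumptions.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

FS = 64  # assumed sampling rate in Hz

def preprocess(signal, low=0.5, high=8.0):
    # Band-pass filter to suppress baseline drift and high-frequency noise.
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, signal)

def extract_features(signal):
    # Simple per-window statistical features: mean, std, range, energy.
    return np.array([signal.mean(), signal.std(),
                     signal.max() - signal.min(), np.sum(signal ** 2)])

# Synthetic data: 200 ten-second windows with binary labels
# (0 = calm, 1 = aggressive); a crude class offset is injected for illustration.
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 10 * FS))
labels = rng.integers(0, 2, size=200)
raw[labels == 1] += 0.5

X = np.array([extract_features(preprocess(w)) for w in raw])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print("Toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))

In practice, the synthetic windows would be replaced by labeled recordings from an emotion-elicitation protocol, and the statistical features by the physiological and acoustic features surveyed in the chapter.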

Citation (APA)

Ali, M., Mosa, A. H., Machot, F. A., & Kyamakya, K. (2018). Emotion recognition involving physiological and speech signals: A comprehensive review. In Studies in Systems, Decision and Control (Vol. 109, pp. 287–302). Springer International Publishing. https://doi.org/10.1007/978-3-319-58996-1_13
