Online vigilance analysis combining video and electrooculography features

Abstract

In this paper, we propose a novel system for analyzing vigilance level that combines video and electrooculography (EOG) features. On the one hand, video features extracted from an infrared camera include the percentage of eyelid closure (PERCLOS) and eye blinks, while slow eye movements (SEM) and rapid eye movements (REM) are extracted from the EOG signals. On the other hand, additional features such as yawn frequency, body posture, and face orientation are extracted from the video using an Active Shape Model (ASM). Our experimental results indicate that this approach outperforms existing approaches based on video or EOG alone. In addition, the predictions produced by our model closely track the subjects' actual error rates. We believe that this method can be widely applied in the future to prevent accidents caused by fatigued driving. © 2012 Springer-Verlag.
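
The abstract names the fused feature set (PERCLOS, blinks, and yawn/posture/orientation cues from video; SEM and REM from EOG) and states that the combined features are mapped to the subject's error rate, but it does not specify the fusion model. The sketch below is only an illustration of that idea: it computes PERCLOS as the fraction of eye-closed frames in a window, concatenates it with assumed blink, yawn, SEM, and REM statistics, and fits a ridge regressor on synthetic data. The window features, their names, and the choice of ridge regression are assumptions for demonstration, not the authors' implementation.

```python
"""Illustrative feature-fusion sketch; not the paper's actual pipeline."""
import numpy as np
from sklearn.linear_model import Ridge


def perclos(eye_closed: np.ndarray) -> float:
    """PERCLOS: fraction of frames in a window where the eyes are closed."""
    return float(np.mean(eye_closed))


def fuse_window_features(eye_closed: np.ndarray, blink_count: float,
                         yawn_count: float, sem_ratio: float,
                         rem_ratio: float) -> np.ndarray:
    """Concatenate video-derived and EOG-derived features for one window."""
    return np.array([perclos(eye_closed), blink_count, yawn_count,
                     sem_ratio, rem_ratio])


# Toy example: learn a mapping from fused features to an error rate.
rng = np.random.default_rng(0)
X = rng.random((200, 5))                      # 200 windows, 5 fused features
y = 0.6 * X[:, 0] + 0.3 * X[:, 3] + 0.05 * rng.standard_normal(200)
model = Ridge(alpha=1.0).fit(X, y)
print(model.predict(X[:3]))                   # predicted error rates
```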

Citation

Du, R. F., Liu, R. J., Wu, T. X., & Lu, B. L. (2012). Online vigilance analysis combining video and electrooculography features. In Lecture Notes in Computer Science (Vol. 7667, pp. 447–454). Springer. https://doi.org/10.1007/978-3-642-34500-5_53
