Virtual reality-based facial expressions understanding for teenagers with autism

Abstract

Technology-enabled intervention has the potential to individualize and improve outcomes of traditional intervention. Specifically, virtual reality (VR) technology has been proposed for training core social and communication skills that are impaired in individuals with autism. Various studies have demonstrated that children with autism show slow and atypical processing of emotional faces, which could be due to atypical underlying neural structure. Emotional face recognition is considered one of the core building blocks of social communication, and early impairment in this skill has consequences for later, more complex language and communication skills. This work proposed a VR-based facial emotion recognition mechanism presented within contextual storytelling. Results from a usability study support the idea that individuals with autism may employ different facial processing strategies. The results are discussed with respect to the applicability of multimodal processing for adaptive VR-based systems that deliver individualized intervention. © 2013 Springer-Verlag Berlin Heidelberg.

Citation (APA)

Bekele, E., Zheng, Z., Swanson, A., Davidson, J., Warren, Z., & Sarkar, N. (2013). Virtual reality-based facial expressions understanding for teenagers with autism. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8010 LNCS, pp. 454–463). https://doi.org/10.1007/978-3-642-39191-0_50
