EDDIE: An embodied AI system for research and intervention for individuals with ASD


Abstract

We report on the ongoing development of EDDIE (Emotion Demonstration, Decoding, Interpretation, and Encoding), an interactive embodied AI to be deployed as an intervention system for children diagnosed with High-Functioning Autism Spectrum Disorders (HFASD). EDDIE presents the subject with interactive requests to decode facial expressions presented through an avatar, to encode requested expressions, or to do both in a single session. Facial tracking software interprets the subject's response and allows for immediate feedback. The system fills a need in research and intervention for children with HFASD: it provides an engaging platform for presenting exemplar expressions consistent with mechanical systems of facial action measurement, integrated with an automatic system for interpreting and giving feedback on the subject's expressions. Both live interaction with EDDIE and video recordings of human–EDDIE interaction will be demonstrated.
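The abstract describes two trial types: decoding (the avatar displays an expression and the child names it) and encoding (the child is asked to produce an expression, which facial tracking software then interprets), each followed by immediate feedback. The sketch below illustrates that trial-and-feedback structure only; every name (`Trial`, `TrialType`, `feedback`, the expression labels) is an illustrative assumption, not the authors' implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto


class TrialType(Enum):
    DECODE = auto()  # avatar shows an expression; the child names it
    ENCODE = auto()  # the child is asked to produce an expression


@dataclass
class Trial:
    kind: TrialType
    target_expression: str  # e.g. "happy", drawn from an exemplar set


def score(trial: Trial, response: str) -> bool:
    """Compare the response to the target expression.

    For DECODE trials, `response` is the label the child gave for the
    avatar's expression; for ENCODE trials, it stands in for the label
    the facial tracker recognised on the child's face.
    """
    return response.strip().lower() == trial.target_expression.lower()


def feedback(trial: Trial, response: str) -> str:
    """Return immediate feedback text for a single trial."""
    if score(trial, response):
        return "Correct!"
    return f"Let's try again: the target was '{trial.target_expression}'."
```

A single session could then interleave decode and encode trials, calling `feedback` after each response to give the immediate feedback the system is described as providing.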

Citation (APA)

Selkowitz, R., Rodgers, J., Moskal, P. J., Mrowczynski, J., & Colson, C. (2016). EDDIE: An embodied AI system for research and intervention for individuals with ASD. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 4385–4386). AAAI Press. https://doi.org/10.1609/aaai.v30i1.9845
