Smartphones get emotional: Mind reading images and reconstructing the neural sources

Abstract

Combining a wireless EEG headset with a smartphone offers new opportunities to capture brain imaging data reflecting our everyday social behavior in a mobile context. However, processing the data on a portable device will require novel approaches to analyze and interpret significant patterns in order to make them available for runtime interaction. Applying a Bayesian approach to reconstruct the neural sources, we demonstrate the ability to distinguish among emotional responses reflected in different scalp potentials when viewing pleasant and unpleasant pictures compared to neutral content. Rendering the activations in a 3D brain model on a smartphone may not only facilitate differentiation of emotional responses but also provide an intuitive interface for touch-based interaction, allowing both for modeling the mental state of users and for providing a basis for novel bio-feedback applications. © 2011 Springer-Verlag.
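
To make the inverse problem concrete, the following is a minimal sketch of Bayesian (MAP) source reconstruction for the linear EEG forward model, which under Gaussian priors reduces to a regularized minimum-norm estimate. It is not the paper's implementation; the lead-field matrix, channel/source counts, and noise levels below are hypothetical placeholders.

    # Sketch of MAP source reconstruction for eeg = L @ s + noise,
    # with zero-mean Gaussian priors on sources and noise.
    import numpy as np

    def map_source_estimate(eeg, leadfield, noise_var=1.0, source_var=1.0):
        """MAP estimate of source amplitudes s.

        Equivalent to Tikhonov-regularized minimum-norm estimation:
        s_hat = L.T @ inv(L @ L.T + (noise_var / source_var) * I) @ eeg
        """
        n_channels = leadfield.shape[0]
        lam = noise_var / source_var
        gram = leadfield @ leadfield.T + lam * np.eye(n_channels)
        return leadfield.T @ np.linalg.solve(gram, eeg)

    # Toy usage with random stand-ins for a real forward model and EEG sample.
    rng = np.random.default_rng(0)
    L = rng.standard_normal((14, 1028))    # e.g. a 14-channel headset, 1028 cortical sources (assumed numbers)
    s_true = rng.standard_normal(1028)
    y = L @ s_true + 0.1 * rng.standard_normal(14)
    s_hat = map_source_estimate(y, L)
    print(s_hat.shape)                     # (1028,) reconstructed source amplitudes

The reconstructed source amplitudes could then be mapped onto a 3D brain mesh for rendering on the device; the regularization weight (noise_var / source_var) controls the trade-off between fitting the scalp data and suppressing noise.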

Citation (APA)

Petersen, M. K., Stahlhut, C., Stopczynski, A., Larsen, J. E., & Hansen, L. K. (2011). Smartphones get emotional: Mind reading images and reconstructing the neural sources. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6975 LNCS, pp. 578–587). https://doi.org/10.1007/978-3-642-24571-8_72
