Tackling mental health by integrating unobtrusive multimodal sensing


Abstract

Mental illness is a growing burden on modern societies and strains the capacity of current public health systems worldwide. With the widespread adoption of social media and mobile devices, and rapid advances in artificial intelligence, a unique opportunity arises for tackling mental health problems. In this study, we investigate how users' online social activities and physiological signals detected through ubiquitous sensors can be utilized in realistic scenarios for monitoring their mental health states. First, using modern computer vision and signal processing techniques, we extract a suite of multimodal time-series signals from recruited participants while they are immersed in online social media designed to elicit emotions and emotion transitions. Next, we use machine learning techniques to build a model that links the extracted multimodal signals to mental states. Finally, we validate the effectiveness of our approach on two groups of recruited subjects.
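The pipeline outlined above (per-modality feature extraction followed by a learned mapping to mental states) can be sketched minimally as below. This is an illustrative sketch only: the modality names (`hr`, `typing`), the (mean, std) features, the toy labels, and the nearest-centroid classifier are all assumptions for demonstration, not the paper's actual signals or model.

```python
# Hypothetical sketch of the abstract's pipeline:
# (1) summarize each modality's time series as features,
# (2) fit a simple classifier linking features to a mental-state label.
# All signal names, data, and the model choice are illustrative.
from statistics import mean, stdev

def extract_features(signals):
    """Summarize each modality's time series as (mean, std) features."""
    feats = []
    for name in sorted(signals):          # fixed order across samples
        series = signals[name]
        feats += [mean(series), stdev(series)]
    return feats

class NearestCentroid:
    """Minimal stand-in for the paper's (unspecified) ML model."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            rows = [x for x, lbl in zip(X, y) if lbl == label]
            self.centroids[label] = [mean(col) for col in zip(*rows)]
        return self

    def predict(self, x):
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(x, c))
        return min(self.centroids, key=lambda lbl: sq_dist(self.centroids[lbl]))

# Toy usage: two hypothetical modalities (heart rate, typing cadence)
train = [
    ({"hr": [60, 62, 61], "typing": [5.0, 5.2, 5.1]}, "calm"),
    ({"hr": [95, 99, 97], "typing": [2.0, 1.8, 2.1]}, "stressed"),
]
X = [extract_features(s) for s, _ in train]
y = [label for _, label in train]
model = NearestCentroid().fit(X, y)
print(model.predict(extract_features({"hr": [94, 98, 96], "typing": [2.1, 1.9, 2.0]})))
```

In the study itself, the feature extraction would come from computer vision and signal processing over the sensed streams, and the classifier would be whichever machine learning model the authors trained; this sketch only shows the shape of the signals-to-states mapping.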

Citation (APA)

Zhou, D., Luo, J., Silenzio, V., Zhou, Y., Hu, J., Currier, G., & Kautz, H. (2015). Tackling mental health by integrating unobtrusive multimodal sensing. In Proceedings of the National Conference on Artificial Intelligence (Vol. 2, pp. 1401–1408). AI Access Foundation. https://doi.org/10.1609/aaai.v29i1.9381
