Learning Behavioral Representations from Wearable Sensors

Abstract

Continuous collection of physiological data from wearable sensors enables temporal characterization of individual behaviors. Understanding the relation between an individual’s behavioral patterns and psychological states can help identify strategies to improve quality of life. One challenge in analyzing physiological data is extracting the underlying behavioral states from the temporal sensor signals and interpreting them. Here, we use a non-parametric Bayesian approach to model sensor data from multiple people and discover the dynamic behaviors they share. We apply this method to data collected from sensors worn by a population of hospital workers and show that the learned states can cluster participants into meaningful groups and better predict their cognitive and psychological states. This method offers a way to learn interpretable compact behavioral representations from multivariate sensor signals.
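The abstract does not specify the model, so as a rough illustration of the "non-parametric Bayesian" idea (letting the data determine the number of latent behavioral states), here is a minimal sketch using scikit-learn's truncated Dirichlet-process Gaussian mixture on synthetic sensor-feature windows. This is not the paper's method, which models temporal dynamics shared across people; it only shows how a DP prior shrinks away unneeded states.

```python
# Sketch only: a Dirichlet-process Gaussian mixture as a stand-in for
# non-parametric Bayesian state discovery. The synthetic data below is
# an assumption, not the paper's hospital-worker sensor data.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Fake "sensor feature" windows drawn from two latent behavioral states.
X = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(200, 3)),
    rng.normal(loc=3.0, scale=0.5, size=(200, 3)),
])

# Truncated DP mixture: up to 10 components; the DP prior pushes the
# weights of unsupported components toward zero.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

labels = dpgmm.predict(X)
effective = int((dpgmm.weights_ > 0.05).sum())  # states the data supports
print(effective)
```

In practice the paper's setting would replace the i.i.d. mixture with a temporal model (e.g. an HDP-style hidden Markov model) so that state sequences, not just state assignments, are learned and shared across participants.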

Citation (APA)

Tavabi, N., Hosseinmardi, H., Villatte, J. L., Abeliuk, A., Narayanan, S., Ferrara, E., & Lerman, K. (2020). Learning Behavioral Representations from Wearable Sensors. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12268 LNCS, pp. 245–254). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-61255-9_24
