AtmoSphere: Mindfulness over haptic-audio cross modal correspondence


Abstract

We explore cross-modal correspondence between haptic and audio output for meditation support. To this end, we implement atmoSphere, a sphere-shaped haptic ball used to prototype several haptic/audio designs. Users experience designs intended to instruct them in breathing techniques shown to enhance meditation: the haptic/audio feedback guides the user into a particular breathing rhythm. We detect this rhythm with smart eyewear (JINS MEME), which estimates cardiac and respiratory parameters from embedded motion sensors. Once the target rhythm is achieved, the feedback stops; if the user drops out of the rhythm, the haptic/audio feedback starts again.
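The feedback logic described above is a simple closed loop: play guidance while the user is out of rhythm, stop once the target rhythm is reached, and resume if the user drifts away. A minimal sketch of that loop is shown below; the target rate, tolerance, and function names are illustrative assumptions, not the authors' implementation, and a real system would read the breathing rate from the JINS MEME motion sensors.

```python
# Hypothetical sketch of the breathing-guidance feedback loop.
# TARGET_RATE and TOLERANCE are assumed values, not from the paper.

TARGET_RATE = 6.0   # target breathing rate in breaths per minute (assumed)
TOLERANCE = 0.5     # allowed deviation before guidance resumes (assumed)

def in_target_rhythm(breath_rate: float) -> bool:
    """Return True when the measured breathing rate matches the target."""
    return abs(breath_rate - TARGET_RATE) <= TOLERANCE

def feedback_active(breath_rate: float) -> bool:
    """Decide whether haptic/audio guidance should play.

    Guidance stops once the target rhythm is achieved and starts
    again if the user drops out of the rhythm.
    """
    return not in_target_rhythm(breath_rate)
```

In use, `feedback_active` would be evaluated on each new rate estimate from the eyewear, toggling the haptic/audio output accordingly.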

Citation (APA)

Tag, B., Mannschreck, R., Goto, T., Fushimi, H., Minamizawa, K., & Kunze, K. (2017). AtmoSphere: Mindfulness over haptic-audio cross modal correspondence. In UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers (pp. 289–292). Association for Computing Machinery, Inc. https://doi.org/10.1145/3123024.3123190
