We explore cross-modal correspondence between haptic and audio output for meditation support. To this end, we implement AtmoSphere, a sphere-shaped device that provides haptic feedback and serves as a platform for prototyping several haptic/audio designs. Users can experience designs aimed at instructing them in breathing techniques shown to enhance meditation. The goal of the haptic/audio design is to guide the user into a particular breathing rhythm. We detect this rhythm using smart eyewear (JINS MEME) that estimates cardiac and respiratory parameters from embedded motion sensors. Once the target rhythm is achieved, the feedback stops; if the user drops out of the rhythm, the haptic/audio feedback starts again.
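The start/stop behavior described above amounts to a simple threshold check on the estimated breathing rate. The following sketch illustrates that logic; the target rate, tolerance, and sample readings are illustrative assumptions, not values from the paper:

```python
# Illustrative sketch of the guidance loop (not the authors' implementation).
# TARGET_BPM and TOLERANCE are assumed values for demonstration only.
TARGET_BPM = 6.0   # assumed target breathing rate (breaths per minute)
TOLERANCE = 0.5    # assumed acceptable deviation from the target

def feedback_needed(estimated_bpm: float) -> bool:
    """Guidance is active while the user's breathing deviates
    from the target rhythm, and stops once it is reached."""
    return abs(estimated_bpm - TARGET_BPM) > TOLERANCE

# Example: feedback stays on until the rhythm is reached, stops,
# then resumes when the user drifts away again.
readings = [10.0, 7.2, 6.1, 5.8, 8.5]  # hypothetical rate estimates
states = [feedback_needed(bpm) for bpm in readings]
# states: [True, True, False, False, True]
```

In the actual system, the rate estimate would come from the JINS MEME motion sensors rather than a fixed list of readings.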
Tag, B., Mannschreck, R., Goto, T., Fushimi, H., Minamizawa, K., & Kunze, K. (2017). AtmoSphere: Mindfulness over haptic-audio cross modal correspondence. In UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers (pp. 289–292). Association for Computing Machinery, Inc. https://doi.org/10.1145/3123024.3123190