Controlling your contents with the breath: Interactive breath interface for VR, games, and animations

Citations of this article: 9 · Mendeley readers: 27

Abstract

In this paper, we propose a new interface for controlling virtual reality (VR) content, games, and animations in real time using the user's breath together with the acceleration sensor of a mobile device. Although interaction techniques are central to VR and physically based animation, user interface (UI) methods that rely on different types of devices or controllers have received little attention; most proposed interaction techniques focus on screen touch and motion recognition. In our approach, the direction of the breath is computed from the position and angle between the user and the mobile device, and the control position for manipulating the content is determined from the device's built-in acceleration sensor. Finally, to remove noise from the input breath, the wind magnitude is filtered with a kernel that models a pattern similar to a real breath. To demonstrate the effectiveness of this approach, we produced real-time interaction results by applying the breath as an external force to VR content, games, and animations.
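The noise-filtering step described above can be sketched as a convolution of the raw breath magnitudes with a kernel shaped like a real exhalation. The paper does not give the kernel's exact form, so this sketch assumes a Gaussian-shaped kernel; the function names (`breath_kernel`, `filter_breath`) and parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def breath_kernel(size=9, sigma=2.0):
    # Gaussian-shaped kernel approximating the smooth rise and fall
    # of a real exhalation (assumed shape; the paper's kernel may differ).
    x = np.arange(size) - size // 2
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()  # normalize so filtering preserves overall magnitude

def filter_breath(magnitudes, kernel=None):
    # Convolve the raw breath-magnitude samples with the kernel to
    # suppress high-frequency noise while keeping the breath envelope.
    if kernel is None:
        kernel = breath_kernel()
    return np.convolve(np.asarray(magnitudes, dtype=float), kernel, mode="same")

# Usage: a noisy, roughly constant exhalation of magnitude 1.0
rng = np.random.default_rng(0)
raw = np.ones(50) + 0.2 * rng.standard_normal(50)
smooth = filter_breath(raw)
```

The smoothed magnitude `smooth` would then drive the external force applied to the scene each frame, with the breath direction supplied separately from the user–device geometry.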

Citation (APA)

Kim, J. H., & Lee, J. (2020). Controlling your contents with the breath: Interactive breath interface for VR, games, and animations. PLoS ONE, 15(10). https://doi.org/10.1371/journal.pone.0241498
