Ukemochi: A Video See-through Food Overlay System for Eating Experience in the Metaverse


Abstract

The widespread use of Head-Mounted Displays (HMDs) allows ordinary users to interact with their friends daily in social Virtual Environments (VEs), or the metaverse. However, eating in the metaverse while wearing an HMD is difficult because the Real Environment (RE) is not visible. Currently, users view food in the RE either through the gap between the face and the HMD (None) or by superimposing a video see-through (VST) image on the VE, but both methods reduce the sense of presence. To enable natural eating in a VE, we propose Ukemochi, which improves presence and ease of eating. Ukemochi seamlessly overlays a food segmentation image, inferred by deep neural networks, on a VE. Ukemochi can be used simultaneously with a VE created with the OpenVR API and can therefore be easily deployed in the metaverse. In this study, we evaluated the effectiveness of Ukemochi by comparing three visual presentation methods (None, VST, and Ukemochi) and two meal conditions (Hand condition and Plate condition). The experimental results demonstrated that Ukemochi enables users to maintain a high sense of presence in the VE and improves the ease of eating. We believe that our study will provide users with the experience of eating in the metaverse and encourage further research on eating in the metaverse.
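The core overlay step the abstract describes (blending a DNN-inferred food segmentation from the VST camera over the rendered VE frame) amounts to an alpha composite. The sketch below is illustrative only, not the authors' implementation: the function name, array shapes, and the assumption of a per-pixel soft mask in [0, 1] are all assumptions.

```python
import numpy as np

def overlay_food(ve_frame, camera_frame, food_mask):
    """Composite the segmented food region of a VST camera frame onto a
    virtual-environment frame.

    ve_frame, camera_frame: (H, W, 3) uint8 images of the same size
    food_mask: (H, W) float array in [0, 1], e.g. a segmentation
               network's per-pixel food probability (assumed format)
    """
    alpha = food_mask[..., None]            # broadcast mask over RGB channels
    blended = alpha * camera_frame + (1.0 - alpha) * ve_frame
    return blended.astype(np.uint8)
```

Where the mask is 1 the user sees the real food; where it is 0 the VE shows through unchanged, which is what lets the overlay preserve presence compared with a full VST window.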

Citation (APA)

Nakano, K., Horita, D., Isoyama, N., Uchiyama, H., & Kiyokawa, K. (2022). Ukemochi: A Video See-through Food Overlay System for Eating Experience in the Metaverse. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3491101.3519779
