3D interaction with virtual objects in a precisely-aligned view using a see-through mobile AR system

Abstract

In this paper, we propose a system that enables users to use their hands to interact with virtual objects displayed on a mobile display in a precisely-aligned view. By projecting the 3D scene captured by a depth camera according to the user's viewpoint position, the scene shown on the screen, including the user's hand, appears seamlessly connected to the real scene outside the screen, enabling natural interaction with virtual objects through the screen. We conducted an experiment to evaluate the positional accuracy of the presented images. The maximum mean error was 8.60 mm and the maximum standard deviation was 1.69 mm, which could be improved by further refinement. We also conducted an experiment to evaluate the usability of the system, in which participants performed tasks using the proposed system in the aligned and non-aligned see-through modes. Despite some restrictions of our prototype system, 9 out of 14 participants completed the task faster in the aligned see-through mode. This result suggests the potential of the proposed system for interaction with virtual objects.
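The core idea in the abstract is a viewpoint-dependent projection: points captured by the depth camera are re-projected onto the display plane along rays from the user's estimated eye position, so the on-screen image lines up with the real scene behind the device. The sketch below (Python/NumPy, not from the paper) illustrates this kind of projection; the screen-centered coordinate frame, the physical screen size, and the resolution constants are assumptions made only for the example, not values reported by the authors.

```python
import numpy as np

# Hypothetical screen geometry (assumed values, not from the paper):
SCREEN_W_MM, SCREEN_H_MM = 216.0, 135.0   # physical size of the display
SCREEN_W_PX, SCREEN_H_PX = 2048, 1280     # display resolution

def project_to_screen(points_mm, eye_mm):
    """Project 3D points onto the display plane as seen from the user's eye.

    points_mm : (N, 3) array of 3D points (e.g. from the depth camera),
                in a screen-centered frame: x right, y up, z from the
                screen toward the user, origin at the screen center, in mm.
    eye_mm    : (3,) estimated viewpoint (eye) position in the same frame.

    Returns an (N, 2) array of pixel coordinates on the display.
    """
    e = np.asarray(eye_mm, dtype=float)
    p = np.asarray(points_mm, dtype=float)

    # Intersect each ray eye -> point with the screen plane z = 0.
    t = e[2] / (e[2] - p[:, 2])                     # ray parameter
    hit = e[None, :2] + t[:, None] * (p[:, :2] - e[None, :2])

    # Millimetres on the screen plane -> pixel coordinates
    # (pixel origin at the top-left corner, y growing downward).
    px = (hit[:, 0] / SCREEN_W_MM + 0.5) * SCREEN_W_PX
    py = (0.5 - hit[:, 1] / SCREEN_H_MM) * SCREEN_H_PX
    return np.stack([px, py], axis=1)

# Example: a point 300 mm behind the screen, viewed from 400 mm in front.
eye = np.array([0.0, 0.0, 400.0])
pts = np.array([[50.0, 20.0, -300.0]])
print(project_to_screen(pts, eye))
```

Because the projection depends on the eye position, a point drawn this way appears at the same line of sight as the corresponding real point outside the screen, which is what makes the hand shown on the display look continuous with the real hand.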

Cite (APA)

Unuma, Y., & Komuro, T. (2017). 3D interaction with virtual objects in a precisely-aligned view using a see-through mobile AR system. ITE Transactions on Media Technology and Applications, 5(2), 49–56. https://doi.org/10.3169/mta.5.49
