Dual autostereoscopic display platform for multi-user collaboration with natural interaction

Abstract

In this letter, we propose a dual autostereoscopic display platform with a natural interaction method for sharing visual data among collaborating users. To provide 3D visualization of a model to users who collaborate with each other, a beamsplitter is combined with a pair of autostereoscopic displays, creating the visual illusion of a floating 3D image. To interact with the virtual object, the user's hands are tracked with a depth camera. The gesture recognition technique operates without any initialization step, such as a specific pose or gesture, and supports several commands for controlling virtual objects. Experimental results show that our system visualizes 3D models in real time and handles them reliably under unconstrained conditions, such as complicated backgrounds or a user wearing short sleeves. © 2012 ETRI.
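
As an illustration only (not the authors' implementation), the sketch below shows one common way such a depth-camera pipeline can be structured in Python with OpenCV: the hand is segmented as the blob nearest to the camera, and a contour-solidity heuristic distinguishes an open hand from a closed fist to issue grab/release commands. The depth-band margin, solidity cutoff, and all function names here are assumptions made for the sake of the example.

```python
from typing import Optional, Tuple
import numpy as np
import cv2

# Hypothetical parameters; a real system would calibrate these.
DEPTH_MARGIN_MM = 120   # depth band behind the nearest point kept as "hand"
SOLIDITY_GRAB = 0.88    # solidity above which the hand is treated as closed


def segment_hand(depth_mm: np.ndarray) -> Optional[np.ndarray]:
    """Return the largest contour within the depth band nearest to the camera."""
    valid = depth_mm[depth_mm > 0]
    if valid.size == 0:
        return None
    near = valid.min()
    mask = ((depth_mm > 0) & (depth_mm < near + DEPTH_MARGIN_MM)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)


def classify_gesture(contour: np.ndarray) -> str:
    """Crude open/closed test: a closed fist fills its convex hull more fully."""
    hull = cv2.convexHull(contour)
    hull_area = cv2.contourArea(hull)
    if hull_area == 0:
        return "none"
    solidity = cv2.contourArea(contour) / hull_area
    return "grab" if solidity > SOLIDITY_GRAB else "release"


def hand_command(depth_mm: np.ndarray) -> Optional[Tuple[str, Tuple[float, float]]]:
    """Map one depth frame to (gesture command, hand centroid in pixels)."""
    contour = segment_hand(depth_mm)
    if contour is None:
        return None
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    return classify_gesture(contour), (cx, cy)
```

In a platform like the one described, the returned command and hand centroid would drive the virtual object shown through the beamsplitter, for example translating the model while a "grab" gesture is held and releasing it otherwise.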

Citation (APA)

Kim, H., Lee, G. A., Yang, U., Kwak, T., & Kim, K. H. (2012). Dual autostereoscopic display platform for multi-user collaboration with natural interaction. ETRI Journal, 34(3), 466–469. https://doi.org/10.4218/etrij.12.0211.0331
