A cross-platform framework for physics-based collaborative augmented reality


Abstract

Augmented Reality (AR) provides users with enhanced interaction experiences by allowing computer-generated virtual imagery to overlay physical objects. Here we aim to integrate desktop and handheld AR into a cross-platform environment in which personal-computer and mobile users collaborate in a shared scene and experience physically realistic interactions. In particular, users can intuitively pick up and move virtual objects with their hands in the desktop environment. In realizing the system, we exploit 1) a client/server architecture that connects different computing devices, where the server maintains and manages the virtual scene objects shared by users; 2) a marker-based tracking method that computes the relationship between the camera view and the markers; 3) a computer graphics API that renders the scene as a scene graph; and 4) an approach that combines hand tracking and occlusion-based interaction to estimate hand position in video frames. © 2010 Springer-Verlag Berlin Heidelberg.
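As an illustration of the marker-based tracking step (point 2 above), the sketch below estimates the camera-to-marker pose from the four detected corners of a square marker. It is a minimal sketch only: the paper's own pipeline is not shown here, so this uses OpenCV's solvePnP instead, and the marker size, camera intrinsics, and function names are assumptions for demonstration.

```python
import numpy as np
import cv2

# Assumed marker side length in metres (not specified in the abstract).
MARKER_SIZE = 0.08

# 3D corner coordinates of a square marker centred at the origin, z = 0,
# ordered to match the 2D corners passed in below.
MARKER_CORNERS_3D = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

def camera_from_marker(corners_2d, camera_matrix, dist_coeffs):
    """Return the 4x4 transform taking marker coordinates to camera coordinates.

    corners_2d: (4, 2) pixel coordinates of the detected marker corners.
    """
    ok, rvec, tvec = cv2.solvePnP(
        MARKER_CORNERS_3D, corners_2d.astype(np.float32),
        camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 matrix
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = tvec.ravel()
    return transform

if __name__ == "__main__":
    # Hypothetical intrinsics for a 640x480 camera; real values come from calibration.
    K = np.array([[525.0, 0.0, 320.0],
                  [0.0, 525.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)
    detected = np.array([[300, 200], [380, 205], [375, 285], [295, 280]],
                        dtype=np.float32)
    print(camera_from_marker(detected, K, dist))
```

In a shared-scene setting of the kind the abstract describes, a transform like this would be sent to the server so that every client can place the same virtual object consistently relative to its own camera view.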

Citation (APA)

Liu, D. S. M., Yung, C. H., & Chung, C. H. (2010). A cross-platform framework for physics-based collaborative augmented reality. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6133 LNCS, pp. 80–90). https://doi.org/10.1007/978-3-642-13544-6_8
