Real-time camera planning for navigation in virtual environments

Abstract

In this work, we develop a real-time camera control module for navigation in virtual environments. With this module, the tracking motion of a third-person camera is generated automatically so that the user can focus on controlling the avatar. The core of the module is a motion planner that uses the probabilistic roadmap method together with a lazy update strategy to generate the camera motion, including intercuts when necessary. A dynamic roadmap, specified relative to the avatar, is updated in real time within a time budget in every frame of the control loop to account for occlusions. In addition, the planner allows the user to specify preferences for how the tracking motion is generated. Several examples demonstrate the effectiveness of this real-time camera planning system. © Springer-Verlag Berlin Heidelberg 2008.
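To make the planning loop described above concrete, the following is a minimal, illustrative Python sketch of the general idea: a roadmap of candidate camera placements defined relative to the avatar, lazy occlusion re-checking bounded by a per-frame time budget, and a graph search that returns a tracking path or signals that an intercut is needed. All names here (AvatarRelativeRoadmap, lazy_update, track_or_cut, is_occluded) and the specific cost and search choices are assumptions for illustration, not the authors' implementation.

import heapq
import time

class AvatarRelativeRoadmap:
    """Candidate camera placements stored as offsets relative to the avatar (sketch)."""

    def __init__(self, node_offsets, edges):
        self.node_offsets = node_offsets   # list of (x, y, z) offsets from the avatar
        self.edges = edges                 # adjacency: node index -> list of neighbor indices
        self.blocked = set()               # node indices currently found occluded

    def lazy_update(self, avatar_pos, is_occluded, budget_s=0.002):
        """Re-check node visibility lazily, stopping once the per-frame budget is spent."""
        start = time.perf_counter()
        for i, offset in enumerate(self.node_offsets):
            if time.perf_counter() - start > budget_s:
                break                      # defer remaining checks to later frames
            world = tuple(a + o for a, o in zip(avatar_pos, offset))
            if is_occluded(world, avatar_pos):
                self.blocked.add(i)
            else:
                self.blocked.discard(i)

    def track_or_cut(self, current, goal):
        """Return a path of node indices from current to goal over unblocked nodes,
        or None, in which case the caller can intercut (jump) to the goal viewpoint."""
        dist, prev = {current: 0.0}, {}
        queue = [(0.0, current)]
        while queue:
            d, u = heapq.heappop(queue)
            if u == goal:
                path = [u]
                while u in prev:
                    u = prev[u]
                    path.append(u)
                return path[::-1]
            if d > dist.get(u, float("inf")):
                continue
            for v in self.edges.get(u, ()):
                if v in self.blocked:
                    continue
                nd = d + self._length(u, v)
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(queue, (nd, v))
        return None

    def _length(self, u, v):
        a, b = self.node_offsets[u], self.node_offsets[v]
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Per-frame use (illustrative): re-validate nodes within the budget, then follow the
# path toward the preferred viewpoint, cutting only when no unoccluded path exists.
roadmap = AvatarRelativeRoadmap(
    node_offsets=[(0, 2, -4), (2, 2, -3), (-2, 2, -3)],
    edges={0: [1, 2], 1: [0], 2: [0]},
)
roadmap.lazy_update(avatar_pos=(10, 0, 5), is_occluded=lambda cam, target: False)
path = roadmap.track_or_cut(current=1, goal=0)
print("tracking path:" if path else "intercut needed", path)

The key property this sketch tries to capture is that occlusion checks are the expensive step, so they are spread across frames under a fixed time budget rather than recomputed for the whole roadmap every frame.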

Citation (APA)

Li, T. Y., & Cheng, C. C. (2008). Real-time camera planning for navigation in virtual environments. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5166 LNCS, pp. 118–129). https://doi.org/10.1007/978-3-540-85412-8_11
