Querying multiple video streams and hypermedia objects of a video-based virtual space system

Abstract

By using several digital video cameras equipped with omni-directional sensors, complete views of the activities at a given location can be recorded as omni-directional videos. We use this technique to develop the Retrax System, which provides video-based virtual space environments as part of the Digital City infrastructure. The system aims to realize applications that are not possible with conventional video database systems, such as supporting human activities and communication over computer networks based on recorded video data. It archives the activities at a real-world location and, from the archived data, provides a virtual space in which users can collaborate by walking through the space, annotating it, and communicating with one another. The user interface of the system bridges real and virtual spaces, as well as past and current human activities. This paper describes the purpose and usage of the system and then introduces its architecture, data models, and query language. Because both video data and hypermedia data are essential to the system, a query mechanism that handles both types of data is required. The proposed query language is designed as an extension of SQL and realizes flexible management of both types of data. © Springer-Verlag Berlin Heidelberg 2005.
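The abstract does not give the concrete syntax of the proposed SQL extension, so the snippet below is only a minimal sketch of the kind of combined query it describes: pairing video-stream records with hypermedia annotations by time, here using plain SQLite. The table names (video_segments, hypermedia_objects), columns, and the overlap predicate are assumptions made for illustration, not the actual Retrax schema.

```python
# Illustrative sketch only: the real Retrax schema and SQL extensions are not
# shown in the abstract. This approximates querying video streams together
# with hypermedia annotations via a time-overlap join in standard SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE video_segments (          -- hypothetical omni-directional video segments
    segment_id INTEGER PRIMARY KEY,
    camera_id  TEXT,                   -- which camera recorded the segment
    start_sec  REAL,                   -- segment start time within the archive
    end_sec    REAL
);
CREATE TABLE hypermedia_objects (      -- hypothetical user annotations in the virtual space
    object_id  INTEGER PRIMARY KEY,
    author     TEXT,
    body       TEXT,
    anchor_sec REAL                    -- archive time the annotation refers to
);
""")
conn.executemany("INSERT INTO video_segments VALUES (?,?,?,?)",
                 [(1, "cam-north", 0.0, 60.0), (2, "cam-south", 30.0, 90.0)])
conn.executemany("INSERT INTO hypermedia_objects VALUES (?,?,?,?)",
                 [(1, "alice", "Crowd gathers at the gate", 45.0)])

# Retrieve every video segment that covers the moment an annotation points at,
# i.e. a single query spanning both video data and hypermedia data.
rows = conn.execute("""
    SELECT v.segment_id, v.camera_id, h.author, h.body
    FROM video_segments v
    JOIN hypermedia_objects h
      ON h.anchor_sec BETWEEN v.start_sec AND v.end_sec
""").fetchall()
print(rows)  # both overlapping segments are returned, each paired with alice's note
```

Presumably the actual query language layers video-specific operators (for example, selecting viewing directions from the omni-directional streams) on top of such joins, which plain SQL cannot express directly; the paper itself details those extensions.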

Citation (APA)

Yokota, Y., He, S., & Kambayashi, Y. (2005). Querying multiple video streams and hypermedia objects of a video-based virtual space system. In Lecture Notes in Computer Science (Vol. 3081, pp. 299–309). Springer Verlag. https://doi.org/10.1007/11407546_17
